Sensor Fusion: The Automotive 6th Sense

VALERIE C. COFFEY, SCIENCE WRITER

Advanced vehicles today are full of sensors for monitoring motion, speed, position, pressure, temperature, moisture, and emissions. The number of sensors in cars and trucks will only continue to grow as demand for advanced driver-assistance systems and autonomous vehicles increases. Analysts at Research and Markets, in their “Global Automotive Sensor Market Analysis and Forecast,” predicted that the automotive sensor market will grow at a compound annual growth rate (CAGR) of 7.4% through 2029. Automakers are betting on this rising demand, increasing the sensor count in vehicles to enable the 360° mapping and redundancy that driverless operation requires. For autonomous technology designed to replace human drivers, error in interpreting and acting on a car’s surroundings is unacceptable; the technology must be better than humans at making safe driving decisions.
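
As a quick sanity check on what such a forecast implies, compound growth is straightforward arithmetic. The sketch below uses a normalized base-year market size as a placeholder, not a figure from the report:

```python
# Compound annual growth: size_n = size_0 * (1 + rate)**years.
# The starting value is a placeholder; only the 7.4% CAGR comes from the forecast.
base_size = 1.0   # normalized market size in the base year
cagr = 0.074      # 7.4% compound annual growth rate
years = 7         # e.g., a seven-year forecast horizon

final_size = base_size * (1 + cagr) ** years
print(f"Growth multiple over {years} years: {final_size:.2f}x")  # ~1.65x
```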

Photonics technologies will play an important role in helping car manufacturers and governments make the shift to increasingly automated vehicles. Cameras, lidar, optical sensors, photonic integrated circuits, and displays inside and outside of vehicles and along roadways will combine with radar, sonar, machine learning, and artificial intelligence (AI) to detect and recognize objects, assist drivers and passengers, and improve safety and comfort (Figure 1).



Figure 1. Continental and 3M have partnered to develop intelligent intersection infrastructure with sensors that will interface with those in vehicles. Courtesy of Continental AG.

Video cameras are by far the cheapest and most mature sensor technology for the range and resolution they achieve in classifying objects and textures. For example, Sony’s electric Vision-S concept car, revealed at the 2020 Consumer Electronics Show (CES) in Las Vegas, carries 33 sensors, 10 of which are 7.42-MP IMX324 CMOS image sensors able to detect road signs up to 160 m ahead. The European New Car Assessment Programme (Euro NCAP), a five-star safety rating system for consumers, has proposed requirements for model-year 2022 vehicles in Europe covering cameras or sensors for reversing, detection of driver drowsiness and distraction, lane-keeping assistance, autonomous emergency braking, and blind-spot monitoring in trucks and buses. Vehicles without these technologies will receive lower safety ratings.
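
How a multimegapixel sensor resolves a sign at 160 m comes down to simple geometry. The back-of-envelope sketch below is illustrative only; the horizontal pixel count, lens field of view, and sign width are assumptions, not Sony specifications:

```python
import math

# Back-of-envelope: how many pixels does a road sign span at range?
# All numbers here are illustrative assumptions, not Sony IMX324 specs.
h_pixels = 3840        # assumed horizontal resolution for a ~7.4-MP sensor
fov_deg = 30.0         # assumed horizontal field of view of the lens
sign_width_m = 0.75    # assumed road-sign width
distance_m = 160.0     # detection range cited in the article

# Width of the scene imaged across the sensor at that distance.
scene_width = 2 * distance_m * math.tan(math.radians(fov_deg / 2))
pixels_on_sign = h_pixels * sign_width_m / scene_width
print(f"Scene width at {distance_m:.0f} m: {scene_width:.1f} m")   # ~86 m
print(f"Pixels across the sign: {pixels_on_sign:.1f}")             # ~34 pixels
```

A few tens of pixels across the target is roughly what classification algorithms need, which is why detection range scales with both pixel count and lens choice.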

In the U.S., the National Highway Traffic Safety Administration (NHTSA), a division of the federal Department of Transportation, issues similar regulations, such as the requirement, in effect since May 2018, that all new cars and light trucks under 10,000 lbs carry rearview cameras. The NHTSA 5-star safety rating system is similar to Europe’s, but the agency has not yet mandated lane-keeping assistance or blind-spot detection. In both markets, consumer demand and economies of scale are pushing car companies to incorporate numerous sensors voluntarily, far in advance of any directive. In December, the NHTSA announced that 20 automakers had voluntarily committed to equipping new vehicles with low-speed automatic emergency braking and forward collision warning systems by September 2022. These systems use radar, cameras, and laser sensors to detect imminent crashes, warn the driver, and apply the brakes to reduce the severity of crashes or prevent them altogether.

In August, automotive supplier Continental AG of Hanover, Germany, announced its Road AND Driver camera for driver safety monitoring, in anticipation of the new Euro NCAP requirements (Figure 2). An eye-safe infrared (IR) video camera inside the windshield continuously monitors the orientation of the driver’s face and hands, while a coordinated digital video camera above the rearview mirror monitors the situation on the road. Automakers can design a custom system using a variety of cameras with resolutions ranging from 1 to 8 MP, as well as a field of view of up to 125° for early cross-traffic detection. The vision system will not record data; rather, it continually evaluates whether driving functions need to be safely transferred. Neural networks will handle the decision-making, a prerequisite for future automated driving capabilities. Continental expects series production to start in 2021.
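
In production, trained neural networks would score driver attentiveness and road risk; the toy function below merely illustrates how the two camera streams might be coordinated. The thresholds and action labels are invented for illustration, not taken from Continental's system:

```python
def assistance_action(driver_attention: float, road_hazard: float) -> str:
    """Toy fusion gate for a two-camera driver-monitoring system.

    driver_attention: 0.0 (distracted/asleep) to 1.0 (fully attentive),
        as an interior camera's network might score face/hand orientation.
    road_hazard: 0.0 (clear road) to 1.0 (imminent collision),
        as an exterior camera's network might score the scene.
    Thresholds are illustrative only.
    """
    if road_hazard > 0.8 and driver_attention < 0.3:
        return "take over: emergency braking"
    if road_hazard > 0.8:
        return "warn driver: collision risk"
    if driver_attention < 0.3:
        return "alert driver: distraction/fatigue detected"
    return "monitor"

print(assistance_action(driver_attention=0.1, road_hazard=0.9))
```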



Figure 2. The Road AND Driver camera system will aim coordinated high-resolution, night-vision video cameras inside and outside of the vehicle, and monitor whether the advanced driver-assistance system should take over emergency functions such as braking in the case of driver distraction or fatigue. Courtesy of Continental AG.

Better together

Cameras are not much better than humans at sensing in poor visibility, such as in rain, fog, or glare. For this reason, external sensing cameras are often combined with radar, lidar, and sonar for additional context. Automotive radar systems typically work at 24 or 77 GHz, or at submillimeter (terahertz) frequencies, to detect objects from <1 m to >250 m away. Radar offers relatively inexpensive and reliable distance and velocity sensing that can supplement cameras, even in poor environmental conditions. In fact, subterahertz wavelengths can cut through fog and dust with ease, making radar reliable for tracking the vehicle ahead.1 But radar systems in general are limited by low sensitivity and resolution, so they are not useful for identifying smaller objects or signage.
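
Both the velocity strength and the resolution weakness fall out of basic radar relations. A minimal sketch with illustrative numbers:

```python
C = 3.0e8  # speed of light, m/s

def doppler_velocity(doppler_hz: float, carrier_hz: float = 77e9) -> float:
    """Relative velocity from a radar Doppler shift: v = f_d * c / (2 * f_c)."""
    return doppler_hz * C / (2 * carrier_hz)

def range_resolution(bandwidth_hz: float) -> float:
    """Best-case radar range resolution: dR = c / (2 * B)."""
    return C / (2 * bandwidth_hz)

# A 77-GHz radar seeing a ~5.13-kHz Doppler shift -> ~10 m/s (36 km/h)
# closing speed, measured directly and cheaply.
print(f"{doppler_velocity(5.13e3):.1f} m/s")

# With 1 GHz of sweep bandwidth, echoes closer than ~15 cm cannot be
# separated -- part of why radar struggles with small objects and signage.
print(f"{range_resolution(1e9) * 100:.0f} cm")
```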

Continental is using radar in its left-turn assist system, scheduled to launch in agricultural machinery in Europe in 2021 (Figure 3). The company is also working on fusing radar and camera data for many of its driver-assistance systems. In left-turn assist, digital video camera information, combined with radar on the rear and left side mirrors, will warn drivers of passing vehicles when the field of vision is restricted. The company already offers right-turn assist for trucks, which will be mandatory for new vehicles in the EU by 2022.



Figure 3. The left-turn assist system for agricultural vehicles with limited field of vision will combine camera and radar information to detect passing vehicles in dangerous situations. Courtesy of Continental AG.

Another option to supplement video cameras in external sensing is the longwave IR (LWIR) camera, which records details at distances of up to 50 km at subvisual wavelengths of 8 to 14 µm. Whereas video cameras are weak at sensing in dark conditions, LWIR video can reveal nearly invisible pedestrians or animals based on their heat signatures, even at night in a downpour, although a cost of roughly $3000 per unit remains a barrier to widespread automotive use.
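
The physics behind that capability is radiometric: a warm body emits far more strongly across the 8- to 14-µm band than its cooler surroundings. A minimal sketch using Planck's law, with illustrative temperatures:

```python
import numpy as np

# Planck spectral radiance, integrated over the LWIR band, to illustrate why
# a warm pedestrian stands out against a cool background at 8-14 um.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def band_radiance(temp_k: float, lo_um: float = 8.0, hi_um: float = 14.0) -> float:
    """In-band blackbody radiance (W / m^2 / sr) via a simple numerical sum."""
    wl = np.linspace(lo_um, hi_um, 2000) * 1e-6  # wavelengths in meters
    spectral = (2 * H * C**2 / wl**5) / (np.exp(H * C / (wl * KB * temp_k)) - 1)
    return float(np.sum(spectral) * (wl[1] - wl[0]))

body, background = band_radiance(310.0), band_radiance(280.0)
print(f"Pedestrian (310 K): {body:.1f} W/m^2/sr in band")
print(f"Background (280 K): {background:.1f} W/m^2/sr in band")
print(f"Thermal contrast: {100 * (body - background) / background:.0f}%")
```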

Lidar can complement radar with (in some cases) a longer range of up to 300 m, a higher-resolution 3D point-cloud map of location and velocity, and useful spectral information for distinguishing, for example, ice from water on the roadway, even at night, when cameras are challenged. Lidar is still expensive, but more than 100 companies are focused on next-generation lidar technology expected to address this problem in the next few years. Numerous early-stage competitors are developing solid-state lidar systems that leverage chip-scale manufacturing to bring down cost.2 Among them are Analog Photonics (Boston), Innoviz Technologies (Rosh Ha’ayin, Israel), Lumotive (Bellevue, Wash.), Quanergy (Sunnyvale, Calif.), and Sense Photonics (Research Triangle Park, N.C.). Legacy lidar company Velodyne (San Jose, Calif.), developer of the spinning lidar systems atop the first self-driving car prototypes, joined the solid-state club with its $100 solid-state Velabit sensor, announced at CES 2020.
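
The ranging itself rests on time of flight, and each return becomes a 3D point once combined with the beam's steering angles. A minimal sketch of that conversion, with illustrative values:

```python
import math

C = 3.0e8  # speed of light, m/s

def tof_range(round_trip_s: float) -> float:
    """Pulsed-lidar range from the echo's round-trip time: R = c * t / 2."""
    return C * round_trip_s / 2

def to_point(range_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one lidar return (range plus beam angles) to an x, y, z point."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (range_m * math.cos(el) * math.cos(az),
            range_m * math.cos(el) * math.sin(az),
            range_m * math.sin(el))

# A 2-microsecond round trip corresponds to a 300-m return --
# the upper end of the ranges quoted for automotive lidar.
r = tof_range(2e-6)
print(f"range: {r:.0f} m")
print(f"point: {to_point(r, azimuth_deg=5.0, elevation_deg=1.0)}")
```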

The complex list of sensor advantages and disadvantages helps explain why integrated AI-based smart systems are poised to be better equipped than humans at managing safe driving decisions. Combining the information from two or more sensors looking at even one quadrant around a vehicle produces massive quantities of data that are challenging to coordinate. In addition to reducing the size and cost of existing sensors, numerous companies are focused on creating smart sensor fusion platforms with algorithms that continually adapt, learn, and evolve to make faster, more reliable decisions. A sampling of companies pursuing sensor fusion in automotive applications includes Autonomous Solutions (Mendon, Utah), AEye (Pleasanton, Calif.), First Sensor (Dresden, Germany), Hesai Technology (Shanghai), Infineon (Neubiberg, Germany), and XenomatiX (Leuven, Belgium).
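
At the core of most such platforms is a statistical update that weighs each sensor by its noise, as in a Kalman filter. A minimal sketch of that fusion step, with invented measurement values:

```python
def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Inverse-variance fusion of two sensor estimates -- the core update
    behind Kalman-filter-style sensor fusion. The lower-noise sensor
    automatically receives more weight, and the fused variance shrinks
    below either input's.
    """
    k = var_a / (var_a + var_b)        # gain toward the second sensor
    fused = est_a + k * (est_b - est_a)
    fused_var = (1 - k) * var_a
    return fused, fused_var

# Illustrative numbers: a camera localizes a pedestrian at 41.0 m (sigma ~2 m);
# radar, better at ranging, reports 39.5 m (sigma ~0.5 m).
pos, var = fuse(41.0, 2.0**2, 39.5, 0.5**2)
print(f"fused position: {pos:.2f} m, sigma: {var**0.5:.2f} m")  # ~39.6 m, ~0.49 m
```

Note how the fused uncertainty ends up smaller than either sensor's alone; running such updates continuously, across every quadrant and sensor pair, is what makes the data coordination problem so demanding.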

Gone with the wheel

In January at CES, German luxury vehicle brand Mercedes-Benz demonstrated a futuristic concept car inspired by the 2009 sci-fi movie “Avatar.” The car was designed in consultation with director James Cameron (Figure 4). The electric, automated Mercedes VISION AVTR (Advanced Vehicle Transformation) is designed to be sustainable and “fully intuitive,” recognizing the driver via biometric sensors, although the company has not revealed specifics about the sensor technology. The steering wheel is replaced by a control element in the armrest that is activated by placement of the passenger’s palm. This console, along with a detector in the seatback, senses the passengers’ heartbeats and breathing (Figure 5). The car projects a menu onto the palm of a proffered hand, and features a wraparound, curved display monitor instead of a dashboard (Figure 6). Further quirky characteristics include the car’s transparent doors and its ability to drive sideways into a parking spot on its LED-laced wheels. Despite a demonstration on the Las Vegas Strip, the concept car is not scheduled for mass production any time soon.



Figure 4. Mercedes showcased its futuristic concept car, the VISION AVTR, at CES 2020 — inspired by “Avatar,” the James Cameron film. Courtesy of Mercedes Benz.



Figure 5. The interior of the Mercedes VISION AVTR concept car features biometric sensors in the seats and touch-responsive console elements that negate the need for a steering wheel. Courtesy of Mercedes Benz.

With a more practical eye toward concept cars of the future, researchers at Middle East Technical University in Ankara, Turkey, have demonstrated an optomechanical impact sensor designed to detect automotive crashes faster and at lower cost than existing accelerometers.3 The device involves a rod-shaped silicon photonic crystal cavity sandwiched between gold-plate boundaries. Resonating light confined within the slot makes the cavity highly sensitive to displacement of the conducting boundary. Compared to commercially available devices, the sensor’s response time of 16.6 µs is much shorter, and its sensing area of 106.6 sq µm is much more compact. The research team, led by Serdar Kocaman, assistant professor of electrical engineering, hopes the sensor may someday help reduce casualties from car accidents.
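
The readout principle can be sketched with a toy model: the cavity's optical resonance shifts as the boundary moves, and a probe laser parked on the resonance flank converts nanometer-scale displacements into large transmission changes. All numbers below are illustrative, not taken from the METU design:

```python
# Toy model of a cavity-based displacement readout. Illustrative values only.
LAMBDA0 = 1550.0    # resonance wavelength at rest, nm
FWHM = 0.1          # cavity linewidth, nm (a high-Q cavity)
SHIFT_PER_NM = 0.5  # assumed resonance shift (nm) per nm of boundary motion

def transmission(probe_nm: float, displacement_nm: float) -> float:
    """Lorentzian transmission dip whose center moves with displacement."""
    center = LAMBDA0 + SHIFT_PER_NM * displacement_nm
    half = FWHM / 2
    return 1.0 - 1.0 / (1.0 + ((probe_nm - center) / half) ** 2)

probe = LAMBDA0 + FWHM / 2   # park the probe on the steep resonance flank
for d in (0.0, 0.02, 0.05):  # sub-nanometer boundary displacements
    print(f"displacement {d:.2f} nm -> transmission {transmission(probe, d):.3f}")
```

Even a few hundredths of a nanometer of boundary motion produces a clearly measurable change in transmitted power, which is what makes such cavities attractive as fast, compact impact detectors.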



Figure 6. Simply positioning a hand inside the Mercedes VISION AVTR concept car will trigger a sensor to project a menu onto the palm. Courtesy of Mercedes Benz.

“Current CMOS fabrication technology, along with rapid advancements in photonic nanofabrication methods, has made designs like this easy to integrate into prevailing automotive electronics,” Kocaman said. “Our study shows that optomechanical approaches to automotive security and safety can be practical at a reasonable cost.” Automated vehicles of the future may incorporate many such novel sensors that — when fused into a single system — provide a sixth sense that will outperform humans.

References

1. R. Thakur (April 2018). Advances in IR technology and integration into automotive use. Presented at ARPA-E NEXTCAR 2018 Annual Meeting, On-Board Sensors.

2. V. Coffey (Nov. 2019). High-tech lidar: the future looks fly, www.photonics.com/articles/high-tech_lidar_the_future_looks_fly/p5/v170/i1124/a65101.

3. O. Orsel et al. (Feb. 2020). A 2D slotted rod type PhC cavity inertial sensor design for impact sensing. IEEE J Quantum Electron, Vol. 56, No. 1, www.doi.org/10.1109/JQE.2019.2960134.
