
Photonic Components Set Course for Autonomous Driving

Sana Amairi-Pyka, European Photonics Industry Consortium (EPIC)

Over the last few years, carmakers have been relentlessly pursuing “the car of the future” — one that is electric, autonomous, and connected, and that contains a full range of advanced driver assistance systems (ADAS). This column looks at the motivations and challenges for lidar and the remarkable advancements in photonics and lighting technologies that are helping to make the car of the future a reality.

According to a recent market report, the global ADAS market is forecast to grow from $44 billion in sales in 2020 to just under $95 billion in 2024, a compound annual growth rate of 21%1. The production of photonic technologies that have helped propel autonomous driving forward will expand along with it.

One of the main factors that is driving demand for ADAS is the need for safety — not only because ADAS technologies can protect drivers, passengers, and pedestrians from injury but because many car safety systems are expected to become mandatory, based on pending regulations. For example, the European Union (EU) is likely to make advanced emergency braking systems (AEBS), lane-keeping assistance systems, and reverse cameras mandatory this year.

Lidar systems in use

Light detection and ranging (lidar) sensors are well suited to autonomous systems because of their accuracy in determining distance and their fine angular resolution, capabilities that traditional cameras and radar cannot match. All of the collected data is processed on a chip, which makes the system cost-effective to reproduce in large quantities because no expensive additional hardware is needed. These solid-state sensors are built in silicon and have no moving parts, which makes them more reliable and less expensive to maintain. Their small size also makes them easier to integrate into any part of the vehicle. These advantages have made lidar Audi's technology of choice for the ADAS in its premium models, for example.
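
The geometry behind those two strengths is straightforward: the sensor times how long a laser pulse takes to return, and the angular step between measurements sets how finely the scene is sampled at a given distance. The short Python sketch below illustrates this with purely illustrative numbers (a 667-ns round trip, a 0.1° angular step); the values are examples, not the specifications of any sensor discussed here.

```python
# A minimal sketch of the geometry behind lidar ranging and angular resolution,
# using illustrative numbers rather than the specifications of any real sensor.
import math

C = 299_792_458.0  # speed of light (m/s)

def tof_range_m(round_trip_time_s: float) -> float:
    """Range from the round-trip time of a laser pulse: d = c * t / 2."""
    return C * round_trip_time_s / 2.0

def lateral_spacing_m(range_m: float, angular_resolution_deg: float) -> float:
    """Approximate spacing between adjacent measurement points at a given range."""
    return range_m * math.radians(angular_resolution_deg)

# A pulse returning after ~667 ns corresponds to a target roughly 100 m away.
print(f"range: {tof_range_m(667e-9):.1f} m")
# At 100 m, a 0.1 deg angular step separates neighboring points by about 17 cm.
print(f"point spacing: {lateral_spacing_m(100.0, 0.1):.2f} m")
```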

Lidar also has disadvantages when it is incorporated into autonomous driving systems. Its resolution differs from that of a 2D camera, and it lacks radar's ability to see in bad weather. It requires a huge amount of processing power to interpret up to 1 million measurements per second and translate them into actionable data, and it is approximately 10× costlier than an ADAS mono camera, the most common alternative. Manufacturers of lidar would benefit from a set of industry standards or common specifications covering all driving situations, such as highway and city driving. The variety of ADAS use cases also causes manufacturing costs to fluctuate.
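
To put that processing load in perspective, the back-of-the-envelope calculation below estimates the raw data rate implied by a 1-million-point-per-second stream. The per-point payload is an assumption made for the example, not a figure from the article.

```python
# Back-of-the-envelope estimate of the data rate implied by ~1 million
# lidar measurements per second. The per-point payload is an assumption
# (x, y, z, intensity, timestamp), not a figure from the article.
points_per_second = 1_000_000
bytes_per_point = 4 * 3 + 2 + 8   # three float32 coordinates, uint16 intensity, float64 timestamp

raw_rate_mb_s = points_per_second * bytes_per_point / 1e6
print(f"raw point-cloud stream: ~{raw_rate_mb_s:.0f} MB/s")
# Roughly 22 MB/s of raw points, before any object detection or sensor fusion runs on top.
```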

According to Gunnar Moos, head of autonomous driving at OSRAM, lidar for ADAS will not be successful unless it is incorporated into the designs of volume car manufacturers. For this to happen, three main challenges need to be overcome: high price, the lack of standards, and the complexity that comes from the diverse uses of the technology.

For these reasons, OSRAM is developing a flexible platform (Figure 1) that can accommodate the varying range, resolution, and field-of-view requirements of the various uses for lidar on the road. Such a platform would enable the company to move into volume production for midrange cars and potentially reduce the price of incorporating lidar.

Figure 1. The lidar percept platform for flexible automotive long- and mid-range applications. Courtesy of OSRAM.

Trusting autonomous driving

Building trust in ADAS, which is subject to heavy regulation from governments, is a fundamental challenge for car manufacturers. To this end, Audi (among other auto companies) is working on optimizing displays and symbols to communicate warnings and information about the position and movement of the vehicle to pedestrians, cyclists, and other drivers (Figure 2). Responses from focus groups indicate that white symbols catch people's attention better than most other colors, including blue-green. The main challenge on the road is that, in bright sunlight, drivers cannot see the organic light-emitting diodes (OLEDs) used in the displays. Acoustic signals also need to be incorporated into the system for visually impaired pedestrians, and lidar and camera technologies are required to sense the environment for the safety of the driver and other people in the vicinity.

Figure 2. An Audi Aicon C2 with its pedestrian warning light displayed. Courtesy of Audi.

Data fusion and the training of neural networks are vital to the effectiveness of lidar for ADAS, in particular the fusion of 3D information obtained from the lidar with RGB camera images. Beamagine is working on a solid-state lidar system with embedded data fusion that will be effective in all types of autonomous vehicles and deliver very high resolution, down to 0.05°, over a long 100- to 150-m range (Figure 3). The device works with any type of camera, including thermal, SWIR, MIR, RGB, polarimetric, and neuromorphic.
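
To show what such fusion involves at its core, the sketch below implements the generic projection step that pairs 3D lidar points with camera pixels using extrinsic and intrinsic calibration matrices. This is not Beamagine's implementation, and the calibration values are placeholders; any real system would use its own calibration.

```python
# A generic sketch of the projection step used in lidar-camera fusion:
# 3D points in the lidar frame are transformed into the camera frame and
# projected through a pinhole intrinsic matrix so each point can be paired
# with an RGB pixel. The calibration matrices below are placeholders.
import numpy as np

def project_lidar_to_image(points_xyz, T_cam_lidar, K):
    """Return (u, v) pixel coordinates and depths for lidar points in front of the camera."""
    n = points_xyz.shape[0]
    homo = np.hstack([points_xyz, np.ones((n, 1))])   # N x 4 homogeneous points
    cam = (T_cam_lidar @ homo.T).T[:, :3]             # N x 3 points in the camera frame
    in_front = cam[:, 2] > 0.1                        # keep points ahead of the sensor
    cam = cam[in_front]
    uvw = (K @ cam.T).T                               # pinhole projection
    uv = uvw[:, :2] / uvw[:, 2:3]
    return uv, cam[:, 2]

# Placeholder calibration: identity extrinsics and a simple intrinsic matrix.
T = np.eye(4)
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
pts = np.array([[0.5, -0.2, 20.0], [1.0, 0.1, 50.0]])  # example lidar returns (meters)
uv, depth = project_lidar_to_image(pts, T, K)
print(uv, depth)
```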

Figure 3. A multimodal imaging lidar to be released this year. Courtesy of Beamagine.

Jenoptik has created some of the first mass-produced lidar lenses for driver recognition and lane departure warning systems. The company has used hybrid glass-and-plastic freeform surfaces to redesign conventional optics into monolithic structures that are 4× smaller. Compared to symmetric lenses, the systems are cheaper, more robust, more thermally stable, and easier to assemble because the elements need no alignment.

A company's first lidar chip is often designed without taking the packaging into account, an oversight that results in high manufacturing costs. To overcome this issue, PHIX Photonics Assembly works through the EU's PIXAPP consortium, which has established a common set of rules for designing the packaging of photonic integrated circuits to increase the chances of success at the early stages of production.

Lidar systems require testing and monitoring of the laser and optical components to ensure maximum performance. PHASICS wavefront sensors can measure phases and intensities and provide a complete laser beam qualification. Also, dedicated testing stations perform quality control on lidar filters and optical components (Figure 4).

Figure 4. The Kaleo MultiWAVE test station that tests optical elements at their designed wavelength. Courtesy of PHASICS.

SAND MicroSystems is a consulting and service company that provides solutions to improve the performance of lidar product modules (i.e., the receive, transmit, and power management modules) and the front-end amplifier. For each module, the company can advise on challenges such as photonic noise, responsivity, and linearity; keeping the diode wavelengths constant; and generating reliable data on the chip or board.
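
To give a sense of the trade-offs such advice addresses, the sketch below estimates the shot-noise-limited signal-to-noise ratio of a lidar receive module from textbook relations (photocurrent equals responsivity times optical power). All parameter values are assumptions chosen for illustration, not figures from SAND MicroSystems.

```python
# Illustrative shot-noise-limited SNR estimate for a lidar receive module,
# using textbook relations: I = R * P and shot noise = sqrt(2 q I B).
# All parameter values are assumptions for the sake of the example.
import math

Q = 1.602e-19  # electron charge (C)

def receiver_snr_db(optical_power_w, responsivity_a_per_w, bandwidth_hz, dark_current_a=1e-9):
    """Shot-noise-limited SNR of a photodiode front end, in dB."""
    signal_current = responsivity_a_per_w * optical_power_w
    shot_noise = math.sqrt(2 * Q * (signal_current + dark_current_a) * bandwidth_hz)
    return 20 * math.log10(signal_current / shot_noise)

# Example: 10 nW of returned light on a 0.9 A/W photodiode with 100 MHz bandwidth.
print(f"SNR ≈ {receiver_snr_db(10e-9, 0.9, 100e6):.1f} dB")
```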

Future of ADAS

Over the next four years, the global camera and lidar ADAS markets are forecast to reach over $10 billion in sales1. This growth will provide exciting opportunities for photonics companies to develop ADAS technologies and find solutions to several challenges. These solutions include systems for building trust in automated and autonomous driving, such as the communication systems being developed by Audi. They also include solutions for testing, packaging, data fusion, and standardization, such as a one-module approach that is flexible enough to accommodate future standards.

Meet the author

Sana Amairi-Pyka, Ph.D., is an experimental physicist specializing in quantum optical metrology, and a project leader at the European Photonics Industry Consortium (EPIC). She works mainly on innovations in optical manufacturing technologies and the quantum photonics industry. Among her responsibilities are the dissemination and communication of the Horizon 2020-funded European PHABULOµS project, the new pilot line providing highly advanced and robust manufacturing technology for free-form micro-optics and microstructures. Amairi-Pyka obtained her doctorate in optical metrology and frequency standards at the National Metrology Institute of Germany and graduated from the University of Hanover; email: [email protected].

Acknowledgment

PIXAPP is the world’s first open-access photonic integrated circuit (PIC) assembly and packaging pilot line. It has received funding from the EU’s Horizon 2020 research and innovation program under Grant Agreement No. 731954.

Reference

1. Technavio (2019). Automotive Advanced Driver Assistance System (ADAS) Market by Application, Technology, and Geography — Forecast and Analysis 2020-2024.

Published: January 2021