
Researchers Fool Autonomous Vehicle Systems with Phantom Images

Researchers from Ben-Gurion University of the Negev’s Cyber Security Research Center found that they can trick the autopilot of an autonomous vehicle into erroneously applying its brakes in response to “phantom” images projected on a road or billboard.

The researchers demonstrated that autopilots and advanced driver-assistance systems (ADAS) in semi-autonomous or fully autonomous cars register depthless projections of objects (phantoms) as real objects. They showed how attackers can exploit this perceptual weakness to manipulate the vehicle and potentially harm the driver or passengers, without any special expertise, using only a commercial drone and an inexpensive image projector.
In the Ben-Gurion University of the Negev research, a Tesla considers the phantom image to be a real person (left), and a Mobileye 630 PRO autonomous vehicle system considers the image projected on a tree to be a real road sign (right). Courtesy of BGU Negev Cyber Security Research Center.

Though fully and semi-autonomous cars are already being deployed around the world, vehicular communication systems that connect the car with other cars, pedestrians, and surrounding infrastructure are lagging. According to the researchers, the lack of such systems creates a “validation gap,” which prevents autonomous vehicles from validating their virtual perception with a third party. Instead, they rely solely on internal sensors.

In addition to causing the autopilot to apply the brakes, the researchers demonstrated that they can fool the ADAS into believing phantom traffic signs are real when projected for as little as 125 ms within advertisements on digital billboards. Lastly, they showed how fake lane markers projected onto a road by a projector-equipped drone can guide the autopilot into the opposite lane and potentially into oncoming traffic.

“This type of attack is currently not being taken into consideration by the automobile industry,” lead author Ben Nassi said. “These are not bugs or poor coding errors, but fundamental flaws in object detectors that are not trained to distinguish between real and fake objects and use feature matching to detect visual objects.”

In practice, depthless objects projected onto a road are treated as real even though the vehicle’s depth sensors can distinguish 2D projections from 3D objects. The researchers believe this is the result of a “better safe than sorry” policy that causes the car to treat any visual 2D object as real.

The researchers are developing a neural network model that analyzes a detected object’s context, surface, and reflected light, and that can detect phantoms with high accuracy.
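To make the idea concrete, here is a minimal sketch of how a committee of such cues might be combined into a single realness score. This is an illustrative toy, not the BGU team’s neural network: the cue heuristics, thresholds, and weights below are all assumptions chosen for demonstration.

```python
import numpy as np

def surface_cue(crop: np.ndarray) -> float:
    """Surface/texture cue: real objects tend to show more local intensity
    variation than a flat projection. Returns a score in [0, 1],
    higher meaning more real-looking. The 32.0 normalizer is an assumption."""
    roughness = np.abs(np.diff(crop.astype(float), axis=1)).mean()
    return float(min(roughness / 32.0, 1.0))

def light_cue(crop: np.ndarray) -> float:
    """Reflected-light cue: projected phantoms tend to be unusually bright.
    Dimmer crops score as less projector-like (closer to 1)."""
    brightness = crop.astype(float).mean() / 255.0
    return float(1.0 - brightness)

def context_cue(plausible_location: bool) -> float:
    """Context cue: e.g., a road sign detected on a tree trunk is implausible.
    A real system would learn this; here it is a supplied boolean."""
    return 1.0 if plausible_location else 0.0

def realness(crop: np.ndarray, plausible_location: bool,
             weights=(0.4, 0.3, 0.3)) -> float:
    """Weighted committee of the three cues. Scores below 0.5 would be
    flagged as phantoms. The weights are illustrative."""
    scores = (context_cue(plausible_location),
              surface_cue(crop),
              light_cue(crop))
    return float(np.dot(weights, scores))
```

For example, a uniformly bright 8×8 crop in an implausible location (a phantom-like detection) scores well below 0.5, while a textured, dimmer crop in a plausible location scores well above it.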
Mar 2020

©2023 Photonics Media, 100 West St., Pittsfield, MA, 01201 USA, [email protected]
