Car Steered with Eyes Alone
BERLIN, May 4, 2010 — Scientists at Freie University, working under the direction of computer science professor Raúl Rojas, gave a completely new meaning to the standard rule for drivers, “Keep your eyes on the road!” Using software they developed, they can steer a car with their eyes.
On the site of the former Berlin Tempelhof Airport, Rojas and his team from the Artificial Intelligence Group demonstrated how they can steer a vehicle equipped with complex electronics just by eye. More than 60 journalists from around the world were there to watch.
The eyeDriver software is a prototype application for steering the research vehicle Spirit of Berlin with eye movements. The software was designed by computer scientists at Freie University in collaboration with the company SMI (SensoMotoric Instruments GmbH). The driver's eye movements are tracked and converted into control signals for the steering wheel. Speed is controlled separately and is not part of eyeDriver. The software demonstrates that a vehicle can be steered with eye movements alone.
The Spirit of Berlin is a 2000 Grand Caravan equipped with a suite of sensors and software that enable it to drive semiautonomously. In this demonstration, it was steered solely by the direction of the driver's gaze.
SMI’s HED4 hardware package is used for detecting and tracking the eye movements. It is a converted bicycle helmet equipped with two cameras and an infrared LED, as well as a laptop computer running special software. One camera points forward in the same direction the wearer faces (the scene camera), while the other films one of the wearer’s eyes (the eye camera). The infrared LED illuminates the observed eye to support the eye camera. A transparent mirror that reflects only infrared light gives the eye camera a reasonable viewing angle without limiting the wearer’s vision. After a brief calibration, the software on the HED4 laptop not only captures the position of the pupil in the eye camera, but also calculates the position in the scene camera image that the wearer is looking at.
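SMI's calibration and gaze-estimation algorithms are proprietary, but the core idea of mapping a pupil position in the eye camera to a viewing position in the scene camera can be sketched with a least-squares fit. Everything below, including the affine model and the calibration points, is an illustrative assumption rather than SMI's actual method:

```python
import numpy as np

def fit_gaze_mapping(pupil_pts, scene_pts):
    """Fit an affine map from pupil (eye-camera) coordinates to
    scene-camera coordinates from calibration samples.

    pupil_pts, scene_pts: (N, 2) arrays of corresponding points."""
    pupil_pts = np.asarray(pupil_pts, dtype=float)
    scene_pts = np.asarray(scene_pts, dtype=float)
    # Design matrix [x, y, 1] for each calibration sample.
    A = np.hstack([pupil_pts, np.ones((len(pupil_pts), 1))])
    coeffs, *_ = np.linalg.lstsq(A, scene_pts, rcond=None)
    return coeffs  # (3, 2) affine coefficients

def map_gaze(coeffs, pupil_xy):
    """Project a single pupil position into the scene image."""
    x, y = pupil_xy
    return np.array([x, y, 1.0]) @ coeffs

# Calibration: the wearer fixates known targets in the scene image.
pupil = [(10, 10), (50, 10), (10, 40), (50, 40)]
scene = [(100, 100), (500, 100), (100, 400), (500, 400)]
c = fit_gaze_mapping(pupil, scene)
print(map_gaze(c, (30, 25)))  # ≈ [300. 250.], the matching scene point
```

A real tracker would also compensate for helmet slippage and use a richer (e.g., polynomial) mapping, but the affine fit shows why a brief calibration is enough to relate the two cameras.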
These coordinates in the scene camera image (the viewing position) are transmitted over an ordinary LAN to the onboard computer of the Spirit of Berlin. The eyeDriver software on the onboard computer receives the viewing positions at regular intervals and uses them to control the steering wheel. The driver can choose between two modes: “free ride” and “routing.”
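Receiving viewing positions over a LAN at regular intervals can be sketched with a small UDP listener. The port number and wire format (two little-endian floats per datagram) are assumptions for illustration; the actual protocol between the HED4 laptop and eyeDriver is not public:

```python
import socket
import struct

def receive_gaze(port=50007, timeout=0.5):
    """Wait for one gaze datagram and return (x, y) in scene-image
    pixels, or None if no gaze arrives before the timeout.

    Wire format assumed here: two little-endian 32-bit floats."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    sock.settimeout(timeout)
    try:
        data, _addr = sock.recvfrom(8)
        return struct.unpack("<ff", data)
    except socket.timeout:
        return None  # gaze lost: the caller should brake
    finally:
        sock.close()
```

The timeout path matters: a control loop built this way treats a missing datagram the same as undetectable gaze and can fall back to braking.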
In the “free ride” mode, the viewing positions are linked directly with the steering wheel motor: the x-coordinate of the viewing position determines the desired position of the steering wheel. The farther the driver looks to the left or right, the farther the steering wheel is turned in that direction. The speed of the vehicle is set in advance and kept constant as long as the gaze position is recognized. If the driver’s gaze cannot be detected – for example, because the eyes are closed – the vehicle brakes automatically.
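The "free ride" mapping can be sketched in a few lines. The image width, maximum wheel angle and linear gain below are illustrative assumptions; the real eyeDriver gains are not given in the article:

```python
def steering_command(gaze_x, image_width=640, max_wheel_deg=30.0):
    """Map the horizontal gaze position to a steering-wheel angle.

    Returns (wheel_angle_deg, brake). gaze_x is the x-coordinate of
    the viewing position in the scene image, or None if undetected."""
    if gaze_x is None:          # gaze not detected (e.g., eyes closed)
        return 0.0, True        # centre the wheel and brake
    # Normalise to [-1, 1], with 0 at the centre of the scene image.
    offset = (gaze_x - image_width / 2) / (image_width / 2)
    offset = max(-1.0, min(1.0, offset))
    return offset * max_wheel_deg, False

print(steering_command(640))   # (30.0, False): full right
print(steering_command(None))  # (0.0, True): brake, gaze lost
```

Clamping the offset keeps a spurious off-screen coordinate from commanding more than the mechanical wheel limit, which is the kind of safeguard any direct gaze-to-actuator link needs.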
In the “routing” mode, the Spirit of Berlin steers autonomously most of the time. Only at a fork in the road or an intersection does the car stop and ask the driver to select the next route: the wearer of the helmet looks to the left or right for three seconds. If the driver’s gaze lingers long enough in one direction, the eyeDriver software confirms acoustically that the choice has been accepted. The decision is communicated to the vehicle’s route planner, and the artificial intelligence in the Spirit of Berlin plans the route accordingly and continues driving autonomously.
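The three-second dwell selection can be sketched as a small loop over timestamped gaze samples. The dead zone and sample format are assumptions for illustration, not details from eyeDriver:

```python
def dwell_choice(gaze_stream, image_width=640, dwell_s=3.0, dead_zone=0.2):
    """At an intersection, wait until the gaze stays on one side of the
    scene image for dwell_s seconds, then return 'left' or 'right'.

    gaze_stream yields (timestamp_seconds, x) tuples. Returns None if
    the stream ends without a sustained sideways gaze."""
    side, since = None, None
    for t, x in gaze_stream:
        offset = (x - image_width / 2) / (image_width / 2)
        if abs(offset) < dead_zone:   # looking straight ahead: reset
            side, since = None, None
            continue
        current = "left" if offset < 0 else "right"
        if current != side:
            side, since = current, t  # gaze switched sides: restart timer
        elif t - since >= dwell_s:
            return side               # confirmed; beep would go here

# Simulated samples: 3.5 s of looking to the right edge of the image.
samples = [(t * 0.5, 600) for t in range(8)]
print(dwell_choice(iter(samples)))  # 'right'
```

Resetting the timer whenever the gaze returns to the centre or switches sides is what keeps a casual glance from being mistaken for a route decision.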
The AutoNomos vehicle project
Dr. Rojas is a professor of artificial intelligence at the Institute of Computer Science at Freie University Berlin. He gained international success with his soccer robots, the “FU-Fighters,” twice world champions in the small-size league. Since 2006, Rojas and his team have been designing technologies related to autonomous vehicles. As part of this project, they developed the research vehicle Spirit of Berlin, which reached the semifinals of the DARPA Urban Challenge in California in 2007.
In the fall of 2009, as part of a series of innovative vehicle-steering interfaces built on the test vehicle, the computer scientists Tinosch Ganjineh and Miao Wang developed iDriver, which makes it possible to steer the research car with an iPhone. The eyeDriver software, developed primarily by Wang and David Latotzky in cooperation with SMI, now complements this series. Both developments are subprojects; the core of the research remains autonomous driving.
Rojas’ team is continuing the development of autonomous and semiautonomous cars in the AutoNomos project, headed by Ganjineh. The project is funded for two years by the German Ministry of Education and Research through its ForMaT (Forschung für den Markt im Team) program and aims to make a significant contribution to accident-free, efficient and environmentally friendly mobility. AutoNomos is a modular system for operating autonomous or semiautonomous cars. With it, impending dangers on roads, highways and crossings (lane changes, traffic jams, rights of way) can be detected at an early stage and accidents prevented. Once the technology is ready, it will be introduced first on private property and, finally, in public traffic.
For more information, visit: autonomos.inf.fu-berlin.de
- artificial intelligence: The ability of a machine to perform certain complex functions normally associated with human intelligence, such as judgment, pattern recognition, understanding, learning, planning and problem solving.