

Researchers Improve ToF Resolution 1000-Fold Using Computational Approach

In a novel approach to time-of-flight (ToF) imaging, researchers applied the principle of heterodyning to ToF, increasing depth resolution 1000-fold. Higher ToF resolution could enable accurate distance measurements through fog, removing a major obstacle to the development of self-driving cars.

At a range of 2 m, existing ToF systems have a depth resolution of about 1 cm. In contrast, a new ToF system developed by a team at MIT has a depth resolution of 3 µm at distances of 2 m.


MIT researchers present a heterodyne lidar scheme that uses a cascaded stack of telecommunication modulators to enable time-of-flight 3D imaging with micrometer range precision. Courtesy of Achuta Kadambi and Ramesh Raskar, MIT.

With ToF imaging, light-burst length is one factor that determines system resolution. A burst of light is fired into a scene, and a camera measures the time it takes to return, which indicates the distance of the object that is reflecting the light.
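The basic ranging relation behind this can be sketched in a few lines: depth is half the distance light covers during the measured round trip. (A minimal illustration; the values are not from the MIT system.)

```python
# Basic ToF ranging: depth is half the round-trip distance light travels.
C = 299_792_458.0  # speed of light, m/s

def tof_depth_m(round_trip_s: float) -> float:
    """Depth implied by a measured round-trip time."""
    return C * round_trip_s / 2.0

# Light reflected from a target 2 m away returns after roughly 13.3 ns:
round_trip = 2 * 2.0 / C
print(f"{tof_depth_m(round_trip):.3f} m")  # → 2.000 m
```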

Detection rate also determines resolution. According to the researchers, today’s detectors can make only about 100 million measurements per second, which limits existing ToF systems to centimeter-scale resolution.
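A back-of-the-envelope sketch shows the depth spanned by one raw time bin at that detection rate. (Real correlation ToF cameras interpolate phase to resolve well below a single bin, which is how they reach the centimeter scale; the numbers here are only illustrative.)

```python
C = 299_792_458.0           # speed of light, m/s
rate_hz = 100e6             # ~100 million measurements per second (from the article)
time_bin_s = 1.0 / rate_hz  # 10 ns between successive measurements
depth_per_bin_m = C * time_bin_s / 2.0
print(f"{depth_per_bin_m:.2f} m per raw time bin")  # → 1.50 m per raw time bin
```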

To improve resolution, the researchers looked at interferometry, in which a light beam is split in two and half of it is kept circulating locally while the other half is fired into a visual scene. The reflected sample beam is recombined with the locally circulated light, and the difference in phase between the two beams yields a precise measurement of the distance that the sample beam has traveled. However, interferometry requires careful synchronization of the two light beams — a difficult achievement for a moving vehicle.

“You could never put interferometry on a car because it’s so sensitive to vibrations,” researcher Achuta Kadambi said. “We’re using some ideas from interferometry and some of the ideas from lidar, and we’re really combining the two here.”

Kadambi said that the MIT system also borrows from principles of acoustics. For example, if two voices are singing at different pitches, the interplay between these tones will produce a third tone, with a frequency that is the difference between the first two.

The same is true with light pulses. If a ToF imaging system is firing light into a scene at the rate of a billion pulses a second, and the returning light is combined with light pulsing 999,999,999 times a second, the result will be a light signal pulsing once a second — a rate easily detectable with a commodity video camera. That slow “beat” will contain all the phase information necessary to gauge distance.
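A scaled-down numeric sketch of this beat idea follows; kHz tones stand in for the GHz modulation so the signals can be sampled directly, and all frequencies and phases are illustrative, not the paper's values.

```python
import numpy as np

fs = 100_000                    # sample rate, Hz
t = np.arange(0, 2.0, 1 / fs)   # 2 s of samples
f_tx, f_demod = 1000.0, 999.0   # "outgoing" and demodulation tones (scaled down)
phase = 1.2                     # hypothetical phase carried by the returning signal

returned = np.cos(2 * np.pi * f_tx * t + phase)
mixed = returned * np.cos(2 * np.pi * f_demod * t)

# Product = 1/2 cos(2π(f_tx-f_demod)t + phase) + 1/2 cos(2π(f_tx+f_demod)t + phase).
# A slow detector integrates away the fast sum term, leaving a 1 Hz beat whose
# phase is the phase we want. Recover it via the Fourier coefficient at 1 Hz:
beat = f_tx - f_demod
coeff = np.mean(mixed * np.exp(-2j * np.pi * beat * t))
print(round(np.angle(coeff), 3))  # → 1.2
```

The phase of the slow beat equals the phase carried by the original high-frequency return, which is why a commodity camera reading the beat loses none of the depth information.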

Rather than trying to synchronize two high-frequency light signals, as an interferometry system would do, the team modulated the returning signal using the same technology that produced it in the first place — in other words, they pulsed the already-pulsed light.

“The fusion of the optical coherence and electronic coherence is very unique,” said professor Ramesh Raskar. “We’re modulating the light at a few gigahertz, so it’s like turning a flashlight on and off millions of times per second. But we’re changing that electronically, not optically. The combination of the two is really where you get the power for this system.”

Gigahertz optical systems are better at compensating for fog than lower-frequency systems. Fog is problematic for ToF systems because it deflects the returning light signals so that they arrive late and at odd angles. Trying to isolate a true signal when light is scattered is too computationally challenging to do on the fly, said the researchers.

With low-frequency systems, scattering causes only a slight phase shift. But at gigahertz modulation frequencies, the same path differences produce much larger phase shifts, so scattered signals arriving over different paths tend to cancel each other out: the troughs of one wave align with the crests of another. According to the researchers, theoretical analyses performed at the University of Wisconsin and Columbia University suggest that this cancellation could be widespread enough to make identifying a true signal much easier.
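The cancellation argument can be illustrated with a toy phasor sum over scattered paths; the frequencies, path spread, and path count below are hypothetical stand-ins, not the analyses cited above.

```python
import numpy as np

C = 299_792_458.0
rng = np.random.default_rng(0)

def scattered_sum(f_mod_hz, n_paths=10_000, spread_m=0.5):
    """Normalized magnitude of the sum of returns over scattered paths
    whose extra lengths are spread uniformly over `spread_m` meters."""
    extra = rng.uniform(0, spread_m, n_paths)        # extra path length per ray
    phases = 2 * np.pi * f_mod_hz * extra / C        # phase shift of each return
    return abs(np.sum(np.exp(1j * phases))) / n_paths

# At a low modulation frequency the phases barely differ, so scattered
# returns add nearly coherently; at GHz they wrap many times and largely cancel.
print(scattered_sum(10e6))  # close to 1: scatter still adds up
print(scattered_sum(5e9))   # near 0: scatter self-cancels
```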

To test their approach, the team sent a light signal through 500 meters of optical fiber before feeding the signal to the MIT-developed system. The fiber had regularly spaced filters along its length to simulate the power falloff incurred over longer distances.

Results from the team’s experiment showed depth sensing at 3-µm resolution, hertz-level frame rates, and robustness to extreme vibrations. The team believes that these results could be used to develop applications for obtaining high-quality 3D scans in uncontrolled environments, and that the study could provide a foundation for future work in computational imaging, where cascaded modulating elements are incorporated into the correlation ToF architecture.

The research was published in IEEE Access (doi: 10.1109/ACCESS.2017.2775138).

©2024 Photonics Media