Researchers Improve ToF Resolution 1000-Fold Using Computational Approach

Photonics.com
Dec 2017
CAMBRIDGE, Mass., Dec. 28, 2017 — In a novel approach to time-of-flight (ToF) imaging, researchers applied the principle of heterodyning to ToF, increasing depth resolution 1000-fold. Higher ToF resolution could enable accurate distance measurements through fog, removing a major obstacle to the development of self-driving cars.

At a range of 2 m, existing ToF systems have a depth resolution of about 1 cm. In contrast, a new ToF system developed by a team at MIT has a depth resolution of 3 µm at distances of 2 m.


MIT researchers present a new scheme of heterodyne lidar that uses a cascaded stack of telecommunication modulators to enable time-of-flight 3D imaging with micrometer-scale precision. Courtesy of Achuta Kadambi and Ramesh Raskar, MIT.

With ToF imaging, light-burst length is one factor that determines system resolution. A burst of light is fired into a scene, and a camera measures the time it takes to return, which indicates the distance of the object that is reflecting the light.

Detection rate also determines resolution. According to the researchers, today’s detectors can make only about 100 million measurements per second, which limits existing ToF systems to centimeter-scale resolution.
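The timing arithmetic behind those numbers can be sketched in a few lines. This is a back-of-envelope illustration of the distance-from-round-trip relation, not the MIT system itself; the function names are our own:

```python
# Back-of-envelope ToF numbers: distance from round-trip time,
# and the round-trip timing precision needed for a given depth resolution.
C = 299_792_458.0  # speed of light, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to the reflector: the light covers the path twice."""
    return C * t_seconds / 2.0

def timing_needed_for_depth(depth_resolution_m: float) -> float:
    """Round-trip timing precision required to resolve a given depth."""
    return 2.0 * depth_resolution_m / C

# A 2 m range corresponds to a round trip of roughly 13.3 ns.
round_trip = 2.0 * 2.0 / C
# Resolving 1 cm in depth requires ~67 ps of timing precision;
# resolving 3 um would require ~20 fs, far beyond direct detection.
cm_timing = timing_needed_for_depth(0.01)
um_timing = timing_needed_for_depth(3e-6)
```

The femtosecond requirement is why micrometer resolution cannot come from faster detectors alone, and why the team turned to phase-based tricks instead.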

To improve resolution, the researchers looked at interferometry, in which a light beam is split in two and half of it is kept circulating locally while the other half is fired into a visual scene. The reflected sample beam is recombined with the locally circulated light, and the difference in phase between the two beams yields a precise measurement of the distance that the sample beam has traveled. However, interferometry requires careful synchronization of the two light beams — a difficult achievement for a moving vehicle.

“You could never put interferometry on a car because it’s so sensitive to vibrations,” researcher Achuta Kadambi said. “We’re using some ideas from interferometry and some of the ideas from lidar, and we’re really combining the two here.”

Kadambi said that the MIT system also borrows from principles of acoustics. For example, if two voices are singing at different pitches, the interplay between these tones will produce a third tone, with a frequency that is the difference between the first two.

The same is true with light pulses. If a ToF imaging system is firing light into a scene at the rate of a billion pulses a second, and the returning light is combined with light pulsing 999,999,999 times a second, the result will be a light signal pulsing once a second — a rate easily detectable with a commodity video camera. That slow “beat” will contain all the phase information necessary to gauge distance.
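The beat idea can be checked numerically. The sketch below mixes two tones 1 Hz apart and recovers the 1 Hz difference; the frequencies are scaled down to the kHz range purely so the simulation stays small, but the GHz figures in the article work the same way:

```python
import numpy as np

fs = 100_000                       # sample rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)   # 2 seconds of signal

f_return = 10_000.0   # stand-in for the returning pulsed signal
f_local = 9_999.0     # stand-in for the local modulation, 1 Hz lower

# Mixing (multiplying) the two tones yields components at the sum
# (19 999 Hz) and the difference (1 Hz) of the frequencies.
mixed = np.cos(2 * np.pi * f_return * t) * np.cos(2 * np.pi * f_local * t)

# A crude low-pass step: inspect only the spectrum below 100 Hz,
# where the slow "beat" lives.
spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(mixed), 1.0 / fs)
low = freqs < 100.0
beat = freqs[low][np.argmax(spectrum[low])]
print(beat)  # dominant low-frequency component: 1.0 Hz
```

The phase of that slow beat tracks the phase of the fast returning signal, which is what lets a slow detector read out information carried at gigahertz rates.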

Rather than trying to synchronize two high-frequency light signals, as an interferometry system would do, the team modulated the returning signal using the same technology that produced it in the first place — in other words, they pulsed the already-pulsed light.

“The fusion of the optical coherence and electronic coherence is very unique,” said professor Ramesh Raskar. “We’re modulating the light at a few gigahertz, so it’s like turning a flashlight on and off millions of times per second. But we’re changing that electronically, not optically. The combination of the two is really where you get the power for this system.”

Gigahertz optical systems are better at compensating for fog than lower-frequency systems. Fog is problematic for ToF systems because it deflects the returning light signals so that they arrive late and at odd angles. Trying to isolate a true signal when light is scattered is too computationally challenging to do on the fly, said the researchers.

With low-frequency systems, scattering causes a slight shift in phase. But with high-frequency systems, the phase shift is much larger relative to the frequency of the signal. Scattered light signals arriving over different paths will actually cancel each other out — the troughs of one wave will align with the crests of another. According to the researchers, theoretical analyses performed at the University of Wisconsin and Columbia University suggest that this cancellation could be widespread enough to make identifying a true signal much easier.

To test their approach, the team sent a light signal through 500 meters of optical fiber before feeding the signal to the MIT-developed system. The fiber had regularly spaced filters along its length to simulate the power falloff incurred over longer distances.

Results from the team’s experiment showed depth sensing at 3-µm resolution, hertz-level frame rates, and robustness to extreme vibrations. The team believes that these results could be used to develop applications for obtaining high-quality 3D scans in uncontrolled environments, and that the study could provide a foundation for future work in computational imaging, where cascaded modulating elements are incorporated into the correlation ToF architecture.

The research was published in IEEE Access (doi: 10.1109/ACCESS.2017.2775138).

GLOSSARY
machine vision
Interpretation of an image of an object or scene through the use of optical noncontact sensing mechanisms for the purpose of obtaining information and/or controlling machines or processes.
time of flight
(TOF) The length of time needed for a signal to arrive at and be reflected from the target. The basis of an active autoranging/autofocus system.
computer graphics
Computer output in the form of pictorial representation (graphs, charts, drawings, etc.) that is displayed visually.
lidar
An acronym of light detection and ranging, describing systems that use a light beam in place of conventional microwave beams for atmospheric monitoring, tracking and detection functions. Ladar, an acronym of laser detection and ranging, uses laser light for detection of speed, altitude, direction and range; it is often called laser radar.
heterodyning
In optical communications, the translation of optical signals into radio signals, lowering their frequency in detection from greater than 10¹⁴ Hz to less than 10¹⁰ Hz, so that further signal processing can be done with conventional circuitry, yielding improved receiver sensitivity and selectivity.
optoelectronics
A sub-field of photonics that pertains to an electronic device that responds to optical power, emits or modifies optical radiation, or utilizes optical radiation for its internal operation. Any device that functions as an electrical-to-optical or optical-to-electrical transducer. Electro-optic often is used erroneously as a synonym.
