Polarization Data Enhances 3D Scanning Resolution

The resolution of conventional 3D imaging devices can be increased by as much as 1,000 times by exploiting the polarization of light.

That's the conclusion of an international team of researchers who modified a Microsoft Kinect time-of-flight sensor with a polarizing filter. Their device, called Polarized 3D, yielded feature resolution in the hundreds of microns at a distance of several meters. On its own, the Kinect achieved centimeter-scale resolution.




By combining information from a Microsoft Kinect depth frame (a) with polarized photographs, MIT researchers reconstructed the 3D surface shown in (c). Polarization cues can allow coarse depth sensors like the Kinect to achieve laser-scan quality (b). Courtesy of MIT.

The technique combines 3D imaging principles, which are good for determining distances, with polarization sensing, which can be used to determine the orientation of surfaces.

"The work uses each principle to solve problems associated with the other principle," said Yoav Schechner, a professor at Technion, the Israel Institute of Technology. "Because this approach practically overcomes ambiguities in polarization-based shape sensing, it can lead to wider adoption of polarization in the toolkit of machine vision engineers."

Polarization affects the way light bounces off physical objects. If light strikes an object squarely, much of it is absorbed, but whatever reflects back has the same mix of polarizations as the incoming light. At more glancing angles of reflection, however, light with a particular polarization is more likely to be reflected.

This is why polarized sunglasses are good at cutting out glare: Light from the sun bouncing off asphalt or water at a low angle features an unusually high concentration of light with a particular polarization. So the polarization of reflected light carries information about the geometry of the objects it has struck.
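
This angular dependence follows from the Fresnel equations for reflection off a dielectric surface. The sketch below is generic textbook optics rather than anything taken from the Polarized 3D system, and the refractive index of 1.33 (roughly that of water) is an illustrative assumption. It shows the degree of polarization of the reflected light vanishing at normal incidence and peaking near Brewster's angle, about 53 degrees from the normal for water.

```python
import numpy as np

def fresnel_reflectance(theta_i, n1=1.0, n2=1.33):
    """Fresnel power reflectances (R_s, R_p) for light hitting a dielectric
    (n2, e.g. ~1.33 for water) from air (n1) at incidence angle theta_i (radians)."""
    theta_t = np.arcsin(np.clip(n1 / n2 * np.sin(theta_i), -1.0, 1.0))  # Snell's law
    r_s = (n1 * np.cos(theta_i) - n2 * np.cos(theta_t)) / (n1 * np.cos(theta_i) + n2 * np.cos(theta_t))
    r_p = (n1 * np.cos(theta_t) - n2 * np.cos(theta_i)) / (n1 * np.cos(theta_t) + n2 * np.cos(theta_i))
    return r_s**2, r_p**2

def degree_of_polarization(theta_i, n2=1.33):
    """Fraction of specularly reflected light that is linearly polarized."""
    R_s, R_p = fresnel_reflectance(theta_i, n2=n2)
    return (R_s - R_p) / (R_s + R_p)

# Zero at normal incidence, near 1 at Brewster's angle (~53 deg for water):
# the strong polarization of off-angle reflections is what sunglasses exploit.
for deg in (10, 30, 53, 70, 85):
    print(deg, round(float(degree_of_polarization(np.radians(deg))), 3))
```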

This relationship has been known for centuries, but it's been hard to do anything with it because of a fundamental ambiguity about polarized light. Light with a particular polarization, reflecting off of a surface with a particular orientation and passing through a polarizing lens, is indistinguishable from light with the opposite polarization, reflecting off of a surface with the opposite orientation.

This means that for any surface in a visual scene, measurements based on polarized light offer two equally plausible hypotheses about its orientation. Canvassing every possible combination of the two candidate orientations of every surface, in order to identify the one that makes the most sense geometrically, is a prohibitively time-consuming computation.
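
The ambiguity can be seen directly in the standard model of a linear polarizer, in which the measured intensity depends on twice the polarization azimuth. The toy check below is illustrative only, not the researchers' code; it shows that an azimuth and the same azimuth rotated by 180 degrees predict identical readings at every polarizer angle.

```python
import numpy as np

def measured_intensity(pol_angle, azimuth, i_avg=1.0, dop=0.5):
    """Intensity behind a linear polarizer at pol_angle (radians) for partially
    linearly polarized light with polarization azimuth `azimuth`.
    Standard transmitted-radiance sinusoid; note the factor of 2."""
    return i_avg * (1.0 + dop * np.cos(2.0 * (pol_angle - azimuth)))

pol_angles = np.radians([0, 45, 90, 135])
phi = np.radians(25)

a = measured_intensity(pol_angles, phi)
b = measured_intensity(pol_angles, phi + np.pi)  # the "opposite" orientation
print(np.allclose(a, b))  # True: the two hypotheses are indistinguishable
```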

To resolve this ambiguity, the researchers used coarse depth estimates provided by time-of-flight measurements and other methods. Even with this added information, calculating surface orientation from measurements of polarized light is complicated, but it can be done in real time by a graphics processing unit.
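
One simple way a coarse depth map could break the tie, sketched below, is to keep whichever of the two candidate azimuths lies closer to the azimuth of the normal implied by the rough depth. This is a hedged illustration of the general idea, not the researchers' actual algorithm, which the article says runs in real time on a graphics processing unit.

```python
import numpy as np

def disambiguate_azimuth(phi_polar, coarse_depth):
    """Per pixel, choose between azimuth phi and phi + pi using the normals
    implied by a coarse depth map (e.g., from a time-of-flight sensor).
    Illustrative only; both inputs are H x W arrays, angles in radians."""
    # Projected normal azimuth from the coarse depth. The sign convention
    # depends on how depth is defined; a height-field-like convention is assumed.
    gy, gx = np.gradient(coarse_depth.astype(float))
    phi_coarse = np.arctan2(-gy, -gx)

    def angular_distance(a, b):
        d = np.abs(a - b) % (2.0 * np.pi)
        return np.minimum(d, 2.0 * np.pi - d)

    flip = angular_distance(phi_polar + np.pi, phi_coarse) < angular_distance(phi_polar, phi_coarse)
    return np.where(flip, phi_polar + np.pi, phi_polar)
```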

The Polarized 3D system takes three photos of an object, rotating the polarizing filter each time. Algorithms compare the light intensities of the resulting images, producing a 3D reconstruction with resolution 1,000 times greater than the Kinect could achieve on its own. The system also performed better than a precision laser scanner, the researchers said.
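
With three exposures at known polarizer angles, the per-pixel intensity sinusoid can be solved in closed form. The sketch below assumes polarizer angles of 0, 45 and 90 degrees (the article does not specify the angles used) and recovers the average intensity, the degree of polarization and the polarization azimuth, the quantities that feed a surface-orientation estimate.

```python
import numpy as np

def fit_polarization(i0, i45, i90):
    """Recover per-pixel polarization parameters from images taken through a
    linear polarizer at 0, 45 and 90 degrees (arrays of equal shape), using
    the linear Stokes parameters S0, S1, S2."""
    s0 = i0 + i90                # total intensity
    s1 = i0 - i90                # 0 vs. 90 degree balance
    s2 = 2.0 * i45 - i0 - i90    # 45 vs. 135 degree balance
    dop = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-9)  # degree of linear polarization
    azimuth = 0.5 * np.arctan2(s2, s1)                   # polarization angle, defined mod pi
    return s0 / 2.0, dop, azimuth

# Consistency check against the sinusoid i = i_avg * (1 + dop * cos(2*(theta - azimuth))):
i_avg, dop, azimuth = 0.8, 0.4, np.radians(30)
imgs = [i_avg * (1 + dop * np.cos(2 * (t - azimuth))) for t in np.radians([0.0, 45.0, 90.0])]
print(fit_polarization(*imgs))  # ~ (0.8, 0.4, 0.5236 rad)
```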

The technique could lead to high-quality 3D cameras built into cellphones, and perhaps to the ability to snap a photo of an object and then use a 3D printer to produce a replica, the researchers said.

A mechanically rotated polarization filter would probably be impractical in a cellphone camera, but grids of tiny polarization filters that can overlay individual pixels in a light sensor are commercially available. Capturing three pixels' worth of light for each image pixel would reduce a cellphone camera's resolution, but no more than the color filters in existing cameras already do.
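
Per-pixel polarizer grids tile the sensor with a small repeating pattern of filter orientations. The article describes capturing three polarization samples per output pixel; a common commercial layout instead uses a 2 x 2 mosaic of four orientations, and the hypothetical deinterleaving sketch below assumes that layout rather than anything described in the article.

```python
import numpy as np

def split_polarization_mosaic(raw):
    """Split a raw frame from a sensor with a repeating 2x2 micro-polarizer grid
    into one image per filter orientation. Assumed (hypothetical) layout per
    superpixel, in degrees: [[90, 45], [135, 0]]. Each sub-image has half the
    resolution of the raw frame."""
    return {
        90:  raw[0::2, 0::2],
        45:  raw[0::2, 1::2],
        135: raw[1::2, 0::2],
        0:   raw[1::2, 1::2],
    }

frame = np.random.rand(8, 8)  # stand-in for a raw sensor readout
subimages = split_polarization_mosaic(frame)
print({angle: img.shape for angle, img in subimages.items()})  # each (4, 4)
```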

Farther out, the work could also aid in the development of driverless cars. Current autonomous vehicle systems based on laser scanning are reliable under normal illumination conditions, but don't perform as well in rain, snow or fog. That's because water particles in the air scatter light in unpredictable ways, making it much harder to interpret.

In some very simple test cases, Polarized 3D was able to use interference information to handle scattering.

"Mitigating scattering in controlled scenes is a small step," said Achuta Kadambi, a doctoral student at the Massachusetts Institute of Technology. "But that's something that I think will be a cool open problem."

The researchers will present their findings at the International Conference on Computer Vision later this month in Santiago, Chile.
