Bioinspired Sensor Uses Metalenses for Depth from Defocus

CAMBRIDGE, Mass., Nov. 4, 2019 — Researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed a compact sensor that can measure depth in a single shot. The sensor’s design was inspired by the specialized optics of the jumping spider, which has extraordinary depth perception. Each of the spider’s principal eyes contains a few semitransparent retinas arranged in layers. These retinas capture multiple images of the same scene with different amounts of blur, which the spider uses to judge its distance to prey. In computer vision, this type of distance calculation is known as depth from defocus.
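The geometry behind the cue can be sketched with a textbook thin-lens model: the blur-circle radius of a point is zero at the in-focus depth and grows as the point moves off the focal plane, so measurements at different focus settings disambiguate distance. All parameter values below are invented for illustration; they are not the Harvard sensor’s optics.

```python
# Textbook thin-lens sketch of the depth-from-defocus cue. All values
# (focal length, aperture, sensor distance) are invented illustration
# numbers, not taken from the Harvard sensor.
def blur_radius(z, f=0.05, aperture=0.01, sensor_dist=0.055):
    """Blur-circle radius (m) for a point source at depth z (m)."""
    z_img = 1.0 / (1.0 / f - 1.0 / z)          # thin-lens image distance
    return abs((aperture / 2.0) * (sensor_dist - z_img) / z_img)

# The depth whose image lands exactly on the sensor is in perfect focus;
# blur grows as objects move off that plane, encoding their distance.
in_focus_z = 1.0 / (1.0 / 0.05 - 1.0 / 0.055)  # = 0.55 m with these values
```

A single blurred image is ambiguous (near and far points can blur equally); two images with different focus settings resolve that ambiguity, which is the cue both the spider and the sensor exploit.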

So far, replicating this ability in machines has required large cameras with motorized internal components that capture differently focused images over time. This has limited the speed and practical applications of sensors based on depth from defocus.

The SEAS researchers combined multifunctional metalenses, nanophotonic components, and efficient computations to create a sensor that can efficiently measure depth from image defocus. The Capasso group had previously demonstrated metalenses that can simultaneously produce several images, each containing different information. Based on that research, the group designed a metalens that can simultaneously produce two images with different blur. “Instead of using layered retinas to capture multiple simultaneous images, as jumping spiders do, the metalens splits the light and forms two differently defocused images side-by-side on a photosensor,” researcher Zhujun Shi said.
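Conceptually, the metalens acts like two lenses sharing one aperture. A toy stand-in for that split, with Gaussian blur substituting for the two defocus states and invented sigma values, might look like:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def split_and_defocus(scene, sigma_a=1.0, sigma_b=2.5):
    """Toy stand-in for the metalens: render the same scene twice,
    side by side on one 'photosensor', with different defocus blur.
    (Sigma values are invented for illustration.)"""
    return np.hstack([gaussian_filter(scene, sigma_a),
                      gaussian_filter(scene, sigma_b)])

# A single bright point: the left sub-image stays sharper (higher peak)
# than the more strongly defocused right sub-image.
scene = np.zeros((32, 32))
scene[16, 16] = 1.0
sensor = split_and_defocus(scene)
```

The point is that both sub-images are captured in the same exposure, which is what removes the need for moving parts or multiple shots.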


The metalens depth sensor works in real time to capture the depth of translucent candle flames. The two images on the left are the raw images captured on the camera sensor. They are formed by the metalens and are blurred slightly differently. From these two images, the researchers compute the depth of the objects in real time. The image on the right shows the computed depth map. Courtesy of Qi Guo and Zhujun Shi/Harvard University.

The researchers coupled the metalens with off-the-shelf components to build a prototype sensor. The current assembly measures 4 × 4 × 10 cm, but because the metalens itself is only 3 mm in diameter, the sensor could be made considerably smaller with a purpose-built photosensor and housing. Because the metalens is designed for monochromatic operation at 532 nm, the researchers paired it with a 10-nm bandpass filter; a rectangular aperture placed in front of the metalens limits the field of view so the two images do not overlap.

An algorithm developed by professor Todd Zickler’s group efficiently interprets the two images and builds a depth map to represent object distance.
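One way to see how two blurs yield per-pixel depth: under a Gaussian blur model, blurring obeys the heat equation, so the pixelwise difference of the two images is proportional to the Laplacian of their mean, with the blur difference as the constant of proportionality. The sketch below recovers that blur difference; it is a generic depth-from-differential-defocus illustration, not the group’s actual algorithm, and mapping the blur difference to metric depth would require calibration of the optics.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def blur_difference_map(img1, img2, eps=1e-3):
    """Per-pixel estimate of sigma1^2 - sigma2^2 from two defocused images.

    Uses the heat-equation relation for Gaussian blur,
    dI/d(sigma^2) = 0.5 * laplacian(I), so approximately
    img1 - img2 = 0.5 * (sigma1^2 - sigma2^2) * laplacian(mean image).
    """
    lap = laplace(0.5 * (img1 + img2))
    out = np.full(img1.shape, np.nan)
    mask = np.abs(lap) > eps               # skip textureless pixels
    out[mask] = 2.0 * (img1 - img2)[mask] / lap[mask]
    return out

# Demo: a smooth bump blurred by two different, known amounts.
y, x = np.mgrid[-32:33, -32:33]
bump = np.exp(-(x**2 + y**2) / (2 * 5.0**2))
img1 = gaussian_filter(bump, 1.0)          # sigma1 = 1
img2 = gaussian_filter(bump, 2.0)          # sigma2 = 2
est = np.nanmedian(blur_difference_map(img1, img2))   # ~ 1**2 - 2**2 = -3
```

Because the estimate is a ratio of two already-computed filter responses, the per-pixel cost stays small, consistent with the article’s point about the algorithm’s efficiency.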

To analyze the depth accuracy, the researchers measured the depths of test objects at a series of known distances and compared them with the true object distances. The 3-mm-diameter metalens measured depth over a 10-cm distance range, using fewer than 700 floating point operations per output pixel.
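An accuracy check of this kind reduces to comparing estimated depths against ground truth, for example via RMS error. The distances below are invented for illustration; the paper reports its own measured figures.

```python
import numpy as np

# Hypothetical version of the accuracy evaluation: compare depth estimates
# against known test distances. All numbers here are invented.
true_z = np.array([0.32, 0.35, 0.38, 0.41])   # ground-truth distances (m)
est_z = np.array([0.33, 0.34, 0.39, 0.40])    # depth-sensor estimates (m)
rmse = np.sqrt(np.mean((est_z - true_z) ** 2))   # = 0.01 m with these values
```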

An illustration of a metalens designed for compact depth sensing. It consists of subwavelength-spaced square nanopillars. By alternating two different nanopillar patterns, visualized here in red and blue, this metalens forms two images at the same time. The two images mimic the images captured by the layered retinas in the eyes of jumping spiders. Courtesy of Qi Guo and Zhujun Shi/Harvard University.

The bioinspired design is lightweight and requires a small amount of computation compared with previous passive artificial depth sensors. The sensor’s small volume, weight, and computational requirements bring depth-sensing capabilities closer to being feasible on insect-scale platforms, such as microrobots, ingestible devices, far-flung sensor networks, and small wearable devices.

Researcher Qi Guo described the ability to design metasurfaces and computational algorithms together as “a new way of creating computational sensors.” The integration of nanophotonics and efficient computation could constitute a paradigm for design in computational sensing.

“Metalenses are a game-changing technology because of their ability to implement existing and new optical functions much more efficiently, faster, and with much less bulk and complexity than existing lenses,” professor Federico Capasso said. “Fusing breakthroughs in optical design and computational imaging has led us to this new depth camera that will open up a broad range of opportunities in science and technology.”

The research was published in the Proceedings of the National Academy of Sciences (https://doi.org/10.1073/pnas.1912154116).   



Inspired by jumping spiders, Harvard University researchers developed a compact and efficient depth sensor that could be used on microrobots, in small wearable devices, or in lightweight virtual and augmented reality headsets. The device combines a multifunctional, flat metalens with an ultra-efficient algorithm to measure depth in a single shot. Courtesy of Qi Guo and Zhujun Shi/Harvard University.

Photonics.com
Nov 2019
GLOSSARY
nanophotonics
The study of how light interacts with nanoscale objects and the technology of applying photons to the manipulation or sensing of nanoscale structures.
