
Sensor Uses Color Encoding for 3D Scene Reconstruction

Researchers at the National University of Singapore (NUS) developed a light-field sensor that detects 3D light fields across the x-ray to visible light spectrum. The sensor relies on a pixelated color conversion strategy that is based on perovskite nanocrystal arrays.

In addition to 3D light-field detection, the researchers' color conversion strategy supported absolute spatial positioning, 3D imaging, and phase-contrast imaging with both visible light and x-rays.

Current light-field detection techniques either require complex microlens arrays or are limited to the ultraviolet (UV) to visible wavelength ranges. The ability to detect light direction beyond optical wavelengths, using color-contrast encoding, could be useful for bio-imaging, robotics, virtual reality, autonomous navigation, and other fields.

The researchers surmised that, given the versatility of color encoding in data visualization, color-contrast encoding could also be used to visualize the direction of light. To test this hypothesis, the team built a system from inorganic perovskite nanocrystals, which are known for their strong optoelectronic properties.
A large-scale angle-sensing structure comprising nanocrystal phosphors, a key component of the sensor, is illuminated under ultraviolet light. Three light-emitting phosphors that produce red, green, and blue light are arranged in a pattern to capture detailed angular information, which is then used for 3D image construction. The team is exploring the use of other materials for the structure. Courtesy of the National University of Singapore.
The researchers combined pixelated perovskite nanocrystal arrays with a color CCD camera to demonstrate 3D object imaging and visible light and x-ray phase-contrast imaging. They patterned perovskite crystals onto a transparent thin-film substrate and integrated the perovskites into the CCD camera to create a crystal conversion system for the sensor. When incident light hit the light-field sensor, the nanocrystals became excited and emitted light in colors that varied depending on the angle of the incoming light. The CCD captured the color that was emitted, and the captured colors were then used for 3D image reconstruction.
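The angle-to-color readout can be pictured with a toy model: if the emitted hue shifts monotonically with the incidence angle, each CCD pixel's recorded color can be inverted back to an angle. The calibration constants below are purely illustrative; the actual mapping in the NUS sensor is fixed by the perovskite phosphor pattern and would be measured experimentally.

```python
# Hypothetical linear calibration: emitted hue shifts with incidence angle.
# Both constants are illustrative assumptions, not values from the paper.
HUE_AT_NORMAL = 0.30     # hue recorded for light arriving at 0 degrees
HUE_PER_DEGREE = 0.005   # assumed hue shift per degree of tilt

def hue_to_angle(hue):
    """Invert the assumed linear hue-to-angle calibration (returns degrees)."""
    return (hue - HUE_AT_NORMAL) / HUE_PER_DEGREE

# A pixel that recorded a hue of 0.35 implies roughly a 10-degree ray.
angle_deg = hue_to_angle(0.35)
```

In the real sensor the inversion would be a measured lookup rather than a linear formula, but the principle is the same: color in, ray direction out, one pixel at a time.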

These multicolor nanocrystal arrays enabled the researchers to convert light rays from specific directions into pixelated color outputs at a very high angular resolution of 0.0018°.


Researcher Yi Luying said, “A single angle value, however, is not enough to determine the absolute position of the object in a three-dimensional space.”

The researchers found that 3D light-field detection and spatial positioning of light sources could be achieved by modifying the specific orientations of the nanocrystal arrays.

“We discovered that adding another basic crystal converter unit, perpendicular to the first detector, and combining it with a designed optical system could provide even more spatial information regarding the object in question,” Luying said.
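One way to see why a second, perpendicular converter unit pins down absolute position is classical triangulation: two detectors at known locations each report a bearing angle to the source, and the intersection of the two rays fixes its coordinates. The geometry below is a simplified 2D sketch, not the paper's actual optical layout.

```python
import math

def locate_source(theta1_deg, theta2_deg, baseline):
    """Intersect two bearing rays to find a point source (2D sketch).

    Detector 1 sits at (0, 0) and detector 2 at (baseline, 0); each angle
    is measured counterclockwise from the positive x-axis. This is an
    illustrative geometry, not the NUS team's optical design.
    """
    t1 = math.tan(math.radians(theta1_deg))
    t2 = math.tan(math.radians(theta2_deg))
    # Ray 1: y = x * t1    Ray 2: y = (x - baseline) * t2
    x = baseline * t2 / (t2 - t1)
    return x, x * t1

# A source at (1, 1) seen by detectors 2 units apart: bearings 45 and 135 deg.
x, y = locate_source(45.0, 135.0, 2.0)
```

With one angle, the source could lie anywhere along a single ray; the second detector's angle selects one point on that ray, which is the extra spatial information the quote describes.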

In proof-of-concept experiments, the light-field sensor captured 3D images of objects from 1.5 m away and accurately reconstructed the object’s depth and dimensions. The experiments also showed that the sensor could resolve very fine details; for example, it was able to create a precise image of a computer keyboard that captured the shallow protrusions of the individual keys. Applications for virtual reality, self-driving cars, and biological imaging, for example, all require precise 3D scene construction capabilities.
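A back-of-the-envelope estimate shows why such fine detail is plausible at that range: the smallest lateral feature distinguishable at distance D with angular resolution Δθ is roughly D·tan(Δθ). The calculation below is an illustrative small-angle estimate using the reported numbers, not a figure from the paper.

```python
import math

# Smallest resolvable lateral feature ~ distance * tan(angular resolution).
# Illustrative estimate only, combining two figures reported in the article.
distance_m = 1.5            # object distance in the proof-of-concept test
resolution_deg = 0.0018     # reported angular resolution
feature_m = distance_m * math.tan(math.radians(resolution_deg))
# feature_m is on the order of 5e-5 m, i.e. tens of micrometers
```

Features that size are far smaller than the millimeter-scale protrusions of keyboard keys, consistent with the reported reconstruction.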

The light-field sensor also captured 3D images at higher depth resolution than other sensors, including those based on microlens arrays. The NUS sensor has an angular measurement range of more than 80°, high angular resolution that could potentially be less than 0.015° for smaller sensors, and a spectral response range of 0.002 to 550 nm.

According to professor Xiaogang Liu, light-field detectors currently obtain multiple images of the same space from many different angles using an array of lenses, or photonic crystals. Integrating these elements into semiconductors is complicated and costly, Liu said. “Conventional technologies can detect light fields only in the ultraviolet to visible light wavelength range, leading to limited applicability in x-ray sensing,” he added.

Surgeons could use the new light-field sensor to accurately image a patient’s anatomy at varying depths. Self-driving cars could use the sensor to assess road hazards more accurately. Liu and his team are currently investigating possible ways to improve the spatial accuracy and resolution of the light-field sensor, such as the use of higher-end color detectors.

The team has applied for an international patent for the technology, and it plans to explore more advanced technologies to pattern perovskite crystals more densely onto the transparent substrate. According to Liu, this could improve spatial resolution, while the use of materials other than perovskite could also expand the sensor’s detection spectrum.

The research was published in Nature (www.doi.org/10.1038/s41586-023-05978-w).

Published: May 2023
