
Photonics Brightens the Future for Augmented Reality

More compact, efficient, and sophisticated light engines and waveguide designs are helping augmented reality become a material reality.

MICHAEL EISENSTEIN, CONTRIBUTING EDITOR

Out in the Gulf of Mexico, an offshore oil rig experiences a critical mechanical failure. The engineers with the expertise to fix the problem are back on the mainland, and flying them to the site on short notice would be costly and logistically difficult. But what if these experts could remotely guide the rig’s mechanical crew through the repairs without leaving home?

The newest generation of AR headsets from Magic Leap features segmented dimming, which provides a consistent, crisp image in both low-light and brightly lit environments. Courtesy of Magic Leap.



Much of the contemporary “metaverse” hype revolves specifically around virtual reality (VR), which fully immerses users in digitally generated worlds. But VR headsets are cumbersome, and most users find the experience disorienting — or even nauseating — over extended periods. Augmented reality (AR) takes a subtler approach, superimposing data and images onto a user’s real-world view. Such capabilities could make emergency repairs on an oil rig far more efficient.

“If someone’s already there and can wear transparent glasses, an expert can look through the crewman’s eyes and augment onto his vision what to repair,” said Nima Shams, vice president of product engineering and management at optics company DigiLens.

AR is already becoming commonplace in the military, and companies in this space are developing solutions that could transform the way many professionals work and collaborate.

Jonathan Gelberg, lead optics scientist at AR company Lumus, has worked with companies that are using AR to help firefighters rescue trapped people, assist architects with design projects, or even guide surgeons during procedures.

“There was a specific case where a doctor did a back surgery, and instead of having to look at the images of a spine on screens around him, the spine was superimposed on the actual back of the patient, and that actually decreased the duration of the surgery by half,” Gelberg said.

In the near future, AR systems could even become so lightweight and unobtrusive that they enable all-day wear and serve as a direct extension of smartphones, allowing users to keep their eyes up rather than buried in a screen. But the technology is still far from mature, and developers continue to grapple with the diverse technical challenges and trade-offs that must be addressed to create a comfortable and immersive AR experience.

Let there be light

The optical workings of an AR headset fundamentally boil down to two components: a light engine and a combiner. The primary visual output is generated by the light engine, which is equivalent to a tiny projector or display, and this is then converted into an image on the lenses of the unit by the combiner element. But a wide range of approaches can be taken to design these two elements.

Ultracompact light engines can now be incorporated into innocuous, lightweight AR glasses. Courtesy of TriLite.



Consider the light engine. At present, the most widely used design involves laser beam scanning, which is employed in AR headsets such as the Microsoft HoloLens 2. The principle is similar to an old-fashioned cathode-ray tube monitor — the display is “painted” by a laser beam that gets rapidly rastered via deflection by tiny mirrors. This approach offers excellent brightness and can also be extremely compact. TriLite Technologies has developed a streamlined laser beam scanner design, for example, that eliminates unnecessary optical elements, achieving a volume of just 1 cm³.

“This gives us the smallest, lightest, and also brightest — in the sense of efficiency — solution,” said Jörg Reitterer, TriLite’s co-founder and CTO. However, laser beam scanning is limited in the image quality that it can produce, with frame rates that typically range between 60 and 90 Hz.

“Sometimes you can see flickering,” said Shin-Tson Wu, an AR/VR researcher at CREOL, The College of Optics and Photonics at the University of Central Florida. “The resolution and sharpness of the laser beam scan compared to active matrix-based displays is much worse.”

But such displays have their own challenges.

For example, there is considerable excitement around micro-LED panels, which can potentially generate high-quality images without requiring a separate illumination element. However, producing RGB micro-LED displays typically entails the laborious assembly of three separate single-color microdisplays alongside optical elements that integrate the images. This can confound efforts to make a compact, high-resolution display.

A scanned laser beam generates a butterfly in an AR display. Courtesy of TriLite.



“For micro-LEDs, as the pixel size decreases to, say, 2.5 µm, the efficiency drops,” Wu said. Alternatively, it is possible to apply color filters to single-color white LED panels, but this, too, greatly reduces display efficiency.

Companies such as Nreal and Lenovo are already pushing forward with micro-LEDs, but others are waiting for the technology to mature.

“Eventually, they’re going to drop down to pixel sizes that we need and the efficiencies that we need to be successful at a large field of view,” said Kevin Curtis, vice president of optical engineering at Magic Leap. “Once you’ve done that, then you can really make some very small eyepieces.”

At present, the most popular alternative light engines are based on liquid crystal on silicon (LCOS), in which a laser or LED source is transmitted through and modulated by a liquid-crystal overlay. “We can achieve very, very high resolution with the current LCOS technology, as well as brightness,” said David Goldman, vice president of marketing at Lumus. The company can achieve pixel size on the order of 2 to 3 µm.

Field of view is gradually increasing with each generation of AR headset. Courtesy of Magic Leap.



Until recently, size has been an impediment for LCOS. According to CREOL’s Wu, such light engines can be over 3× larger than a laser beam scanner or micro-LED system. But several companies are making headway on this front. He cited a front-lit LCOS design developed by a company called Himax Technologies that can be shrunk down to just 0.5 cm³.

Image management

The illumination generated by the light engine typically originates at the outer edges of the lenses on the AR headset and must subsequently be delivered to the wearer’s eyes. This is where the combiner element comes in.

Some AR systems, such as those from Nreal and Qualcomm, use an approach called “birdbath optics,” in which a curved mirror and beamsplitter directly relay the projected image from the light engine to the eye. This results in excellent efficiency, since most of the transmitted light is delivered, but the added optical elements make the eyepieces bulkier and obscure the wearer’s eyes from the outside world. In a 2021 analysis, AR/VR blogger Karl Guttag said the Nreal optics prevent 75% of the light that enters the lenses from exiting.

“It’s like you’re wearing sunglasses indoors all the time and nobody can see your eyes,” DigiLens’ Shams said. This may not be an issue for entertainment or media consumption, but it is problematic for settings in which AR use entails interacting with another person.

Some bulkier headset designs favor performance over comfort, but other AR manufacturers aim to produce lightweight glasses that can be worn all day. Courtesy of TriLite.




As an alternative, many current headsets use waveguide features on the lenses themselves. In this design, the light enters a transparent layer on the lens, where it is initially trapped by total internal reflection before being relayed back out to the wearer after it interacts with physical features incorporated into the waveguide.

“When we use the waveguide, we get some advantages in getting a larger eyebox,” Wu said, referring to the volume behind the lens within which the wearer can perceive AR content. A larger eyebox generally delivers a superior experience and accommodates a wider range of wearers with different head sizes and distances between their eyes.

Most of these waveguides employ diffractive gratings, which offer a well-established approach for steering light. But diffractive waveguides can also be inefficient and lossy — and they can be expensive to manufacture at scale. To address this, DigiLens devised a fabrication strategy that yields high-quality diffractive waveguides with greater options for customization, and at lower cost. Rather than using typical semiconductor foundry-style manufacturing, DigiLens uses a simple laser printing process to create Bragg structures on a polymeric layer sandwiched between two pieces of glass.

The Trixel 3 is the smallest AR light engine currently on the market, according to its manufacturer, TriLite. Courtesy of TriLite.



“The glass is fairly cheap, and the polymer is fractions of pennies because it’s such a small amount,” Shams said. “And the optical numbers that we’re producing are over 500 nits per lumen, which is among the most efficient.”
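As a rough illustration of how a nits-per-lumen figure translates into perceived brightness, the metric can be treated as a single conversion factor between the light engine's output and the luminance the wearer sees. This is a minimal sketch; the 6 lm engine output is a hypothetical value chosen for the example, not a DigiLens specification.

```python
def display_brightness_nits(engine_output_lm, efficiency_nits_per_lm):
    """Estimate eye-side display luminance from light engine output.

    Waveguide makers quote a combined efficiency in nits per lumen,
    which folds the combiner's losses and eyebox geometry into one
    number: luminance = engine flux (lm) * efficiency (nits/lm).
    """
    return engine_output_lm * efficiency_nits_per_lm

# Hypothetical 6 lm light engine through a 500 nits/lm waveguide:
print(display_brightness_nits(6.0, 500.0))  # 3000.0 nits
```

The same figure also shows why combiner efficiency and light engine power trade off directly: halving the waveguide's nits-per-lumen rating doubles the flux the engine must supply for the same on-eye brightness.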

Lumus has developed a unique AR waveguide with a reflective design that incorporates tiny, faceted mirrors into the surface of the glass. They are angled and coated in such a way that the wearer does not notice them, and, at most angles, anyone viewing the wearer will not notice them either, Goldman said. This design yields very efficient light transmission and eliminates the discoloration of real-world objects that can potentially arise with diffractive waveguides — particularly when viewing pure white surfaces, which can produce a distracting rainbow effect. These reflective waveguides are more challenging to manufacture than their diffractive counterparts, but a partnership with German glassmaker SCHOTT has put Lumus on the road to mass production.

Some waveguides are less efficient than others and tend to leak the projected visuals outward rather than directing them toward the wearer. This produces a disconcerting “eye glow” and can even pose a security risk if personal information is being viewed.

New waveguide designs aim to address this problem. DigiLens reports its waveguides exhibit less than 8% leakage, and Lumus claims losses of under 4% for its products.

In pursuit of the perfect package

The system design of a headset critically influences a host of other factors that will ultimately determine the quality of a user’s AR experience.

For example, brightness is heavily informed by the interplay between display and waveguide. A lossy waveguide will necessitate a more powerful light engine, while a more efficient waveguide will be more forgiving of a display’s shortcomings.

But brightness is not a one-size-fits-all proposition, especially if users intend to wear their AR glasses both indoors and outdoors. Wu said most headsets aim for a contrast ratio (that is, the luminance difference between the brightest and darkest elements in a display) of 5:1 in ambient light. But this cannot be resolved by simply pumping up the projector.

“If the ambient light is 3000 nits, then the display must have 15,000 nits, and that could be harmful to our eye,” Wu said.

One solution is to dynamically dim the lenses in response to external light.

Magic Leap 2 includes a particularly innovative solution to this problem. Its dimming system can operate both globally, to broadly block ambient light, and in selected regions of the display, where augmented content is being shown. This helps the display graphics to pop relative to their background, which results in more solid 3D graphics with reduced transparency.
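Wu's arithmetic, and the benefit of dimming, can be sketched as a quick calculation: the display must outshine only the ambient light that actually passes through the lens. This is an illustrative sketch; the 90% dimming figure is an assumed value for demonstration, not a specification of any headset.

```python
def required_display_nits(ambient_nits, contrast_ratio=5.0, lens_transmission=1.0):
    """Display luminance needed to hold a contrast ratio over ambient light.

    Only ambient light transmitted through the lens competes with the
    display, so dimming (lens_transmission < 1) lowers the requirement.
    """
    return ambient_nits * lens_transmission * contrast_ratio

# Undimmed lenses in 3000-nit ambient light, 5:1 contrast target:
print(required_display_nits(3000.0))  # 15000.0 nits, matching Wu's figure
# Hypothetical dimming that blocks 90% of ambient light:
print(required_display_nits(3000.0, lens_transmission=0.1))  # 1500.0 nits
```

The second case shows why dimmable lenses are attractive: they bring the required display luminance back into a range that is both achievable and comfortable for the eye.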

Field of view (FOV) is another hurdle for AR designers, and companies are vying for bragging rights over how much visual real estate they can occupy with AR imagery. But the intended application matters here. The Magic Leap 2 headset is currently leading the pack with an FOV that spans 70° diagonally, which, according to Wu, approaches the theoretical maximum of 78° that can be achieved with waveguide-based systems.

The push for a broader FOV was motivated by user disappointment over first-generation headsets, which typically offered an FOV of 50°, Magic Leap’s Curtis said. “That really was not immersive, and we definitely heard that.” This drove the company to essentially reinvent its optics based on an innovative LCOS design. “That allowed us to make a very small eyepiece that supported a very large eyebox and a very large field of view,” he said.

But for many applications, such as providing a head-up display or assisting wearers with focused tasks directly in front of them, a narrower view may be sufficient. A prototype wearable by TriLite, based on the company’s latest-generation Trixel 3 light engine, fares well with a FOV of 30° for such informational purposes, according to Reitterer.

“It can also support more immersive types of display,” he said, noting that binocular systems can generate a richer experience by combining two overlapping images, even when each individual lens has a relatively narrow FOV.
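The geometry behind that overlap is simple to sketch: each eye sees its own monocular field, and the shared central region is counted once. The 30° displays and 20° overlap below are assumed values for illustration, not TriLite specifications.

```python
def binocular_fov_deg(monocular_fov_deg, overlap_deg):
    """Combined horizontal FOV when two eye displays partially overlap.

    Each eye contributes its own monocular field; the shared central
    region is counted once, so the union spans 2 * monocular - overlap.
    """
    return 2 * monocular_fov_deg - overlap_deg

# Hypothetical: two 30-degree displays sharing 20 degrees in the center:
print(binocular_fov_deg(30, 20))  # 40 degrees combined
```

Reducing the overlap widens the combined field, but the overlapped region is also where stereo depth cues live, so designers must balance the two.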

There is also the challenge of shrinking everything down to fit into a compact and comfortable package.

Some headset designs take a bulkier, maximalist approach that results in high performance but also limits how long the device can be worn. “Our vision is to bring lightweight, all-day wearable glasses to everybody,” Reitterer said. The final product must offer style as well as substance. “It cannot end up looking like some tech demo or industrial type of device.”

Developing lightweight glasses also places burdens on the optical design, requiring manufacturers to shrink the form factor of the light engine and combiner as much as possible without compromising the desired brightness, resolution, and FOV.

Light-efficient reflective waveguides can produce high-quality AR images, such as a virtual airship, even in broad daylight. Courtesy of Lumus.



Finally, there is the matter of cost. Many leading commercial platforms, including HoloLens 2 and Magic Leap 2, target enterprise users, for whom a price point of several thousand dollars may not be problematic — especially in contrast to flying an engineer to an oil rig, for example, or transporting an expert surgeon across the country.

A recent analysis by Yole Développement forecast that the number of AR headsets sold would jump from 420,000 in 2020 to 64.5 million by 2027. Achieving broader mainstream success with the general public will likely require cost cuts. But given that many people will gladly shell out over $1000 for a cutting-edge smartphone, the greater pressure may be to develop a product that can deliver a superior experience within a similar price range for the consumer market.

“They’re used to the compute, the mobility, and the image quality — so the bar is set,” Shams said. “And for a technology that wants to become the next mobile computer, it must not compromise.”

Published: January 2023
