

Augmented Reality Takes Shape

VALERIE C. COFFEY, SCIENCE WRITER, stellaredit@gmail.com

The world of virtual reality (VR) has made a splash in the consumer market in recent years with its fully occluded VR headsets, such as the Oculus Rift and HTC Vive, that transport gamers to a virtual world. But an even bigger trend with more dollars invested in the past year is augmented reality (AR). Unlike the immersive experience of VR, AR incorporates wearable headgear to overlay information on the user’s field of view (Figure 1), adding layers of virtual content to the real world.


Figure 1.
This illustrated view shows how a user of AR smart glasses would see useful information overlaid on the scene of a warehouse. Courtesy of WaveOptics.

A wide variety of well-established optoelectronic technologies enables this AR magic: optical waveguides, prisms, motion detectors, touch displays, display engines with microprojectors, see-through optics, HD cameras, video optics and lenses, and optical sensors for facial recognition and depth. The current challenge is how to economically incorporate the optoelectronics into something functional, useful, power-conservative, and comfortable enough to use as headwear, not to mention make it fashionable.

The market has yet to see a wearable AR product reach consumers. But early developer efforts abound, and they’re getting closer to market. The list of top contenders pursuing consumer AR initiatives reads like a who’s who of tech heavyweights — Amazon, Apple, Facebook, Google, Intel, HTC, and Microsoft. In fact, hundreds of companies worldwide are involved in early-stage efforts to develop AR hardware and software — components, manufacturing tools, platforms, applications, and content — stimulating billions of dollars in investment in AR research and development in the past few years. Startup companies in AR/VR raised over $3.6 billion in the year ending the first quarter of 2018, according to a new market research report — Augmented/Virtual Reality Report Q1 2018 — by investment firm Digi-Capital in Menlo Park, Calif.

The combined mobile AR and wearable AR markets could see 3.5 billion installed users, reaching $85 billion to $90 billion by 2022, the report estimates. The AR market is expected to become six times larger than VR in the next five years, with the ability to be applied far beyond gaming to a wide range of applications, from the factory floor to law enforcement and medicine. The report also concludes that AR will dominate VR for years to come.

The AR market has two main segments in development: mobile AR, consisting of apps that enable smartphone cameras to overlay objects and information on a phone display; and wearable AR, composed of headsets and smart glasses that layer AR onto the user’s real-world view.

In the mobile AR space, Apple’s ARKit and Google’s ARCore are platforms currently in development for smartphones, designed to ride the wave created by popular apps such as Pokémon Go and Snapchat filters. In February 2018, Google released an AR software developer’s toolkit on the Google Play Store for 13 types of Android smartphones, including Google’s Pixel phones and Samsung’s Galaxy S7, S7 Edge, S8, S8+, and Note8, among others. Developers can use this ARCore version 1.0 to start building and publishing apps and games for use on Android smartphones.

Other platforms in development in mobile AR include Facebook camera effects, which could extend in use to the massive mainstream user base of Messenger, WhatsApp, and Instagram. Snap Lens Studio is another, intended for eventual incorporation into Snapchat and Bitmoji.

Heads up

The AR wearables segment of headsets and smart glasses offers varying levels of complexity and features, from a simple step-count readout and clock (such as a Fitbit in glasses) to a full-on computer and semi-immersive display. Many tech giants are working on headsets or smart glasses that will operate in concert with AR mobile platforms already in development.

If this sounds familiar, you may recall the hoopla over Google Glass. If not, it may be because this Google version of the head-mounted wearable computer disappeared. At $1,500, Google Glass was an expensive and buggy foray into the first generation of smart glasses. Citing issues with safety, health, and privacy rights, such as surreptitious recording of conversations, Google pulled the plug in 2015 on the original Google Glass project, which never really made it out of beta testing into consumer applications. However, the second-generation Enterprise Edition of the device has found a home in niche enterprise applications such as factories and warehouses.

Hoping to follow Google and the Enterprise Edition of Glass, other developers are focused on perfecting their features in specialized applications before leaping into the higher-volume consumer space. Improvements will aim to strengthen the functional operation between the eyewear and the computing platform (whether computer or smartphone) and make it a practical tool for managing tasks in real time. The upcoming iterations of smart glasses will also focus on style, appearing more fashionable than Glass.


Figure 2.
In a joint venture between Carl Zeiss AG and Deutsche Telekom AG, Tooz Technologies is developing a prototype for hands-free smart glasses. The design incorporates microdisplays that project information to the user’s eye, combined with cloud-computing capabilities. Courtesy of Zeiss.

In February, German optics producer Carl Zeiss AG announced a joint venture with global telecommunications giant Deutsche Telekom AG to develop smart glasses. The partnership establishes a new company, Tooz Technologies, to be based in the U.S. and Germany, which will integrate Zeiss’ optical technology with delay-free mobile connectivity provided by Telekom. The resulting smart glasses (Figure 2) will use cloud connectivity for high-performance computing. This will enable smaller, lighter, cooler, and power-conservative eyewear that offers hands-free functions in applications that include logistics, surgery, and personal shopping. Eventually, consumer-targeted access to mobile functions such as navigation and social media also will be available.

Numerous other companies are pursuing smart glasses development, including Magic Leap, Meta, ODG, and Vuzix. Microsoft’s HoloLens, a self-contained head-mounted display that overlays holographic images on the wearer’s view, is another example. For now, these types of smart glasses are, at best, available as development kits and, like many other smart glasses, are likely destined for industrial applications before consumer ones.


Figure 3.
The Magic Leap One AR headset projects holographic virtual images and audio to rounded lenses from an attached computer called Lightpack, about the size of a CD player. The platform interface uses voice, gesture, head pose, and eye tracking. Courtesy of Magic Leap.

In just the past year, Florida-based Magic Leap alone raised nearly $1 billion in funding. The secretive startup is building smart glasses based on “dynamic light-field technology” — effectively a projector and an eyepiece imprinted with a special grating waveguide, tethered to a pocket-size computer, which layers 3D images on real-world scenes (Figure 3). In addition to semi-immersive AR gaming, one recent patent revealed plans for a sign-language translation app, in which both speech and sign language can be rendered as text between a deaf or hard-of-hearing user and a hearing speaker. Google has invested heavily in Magic Leap, aiming to differentiate its offering from the flop of Google Glass as well as from new competition. The One Creator Edition is scheduled to launch this year to software developers.

Differentiating factors

An overlay of Pokémon characters on a live smartphone or tablet camera field is one thing. Projecting an image onto human eyes to display information over an actual field of vision provides a more immersive experience that requires a headset or pair of smart glasses. Where glasses are concerned, field of view (FOV) is an important distinguishing factor between an engaging app and a truly immersive AR experience. The wider the FOV, the more realistic the experience. AR companies are competing to develop a way to convert a small input into a large, high-quality projection on the widest FOV possible in a head-mounted display.

Eye-box is a key parameter for any AR display: a large eye-box lets the eye move within a range of viewing positions without vignetting (cropping) the image. WaveOptics, an Oxford, England-based designer and manufacturer of diffractive AR waveguides, has developed a waveguide called Phlox that delivers a 40° FOV across a 19- × 15-mm eye-box at 25-mm eye relief. The wide FOV and large eye-box yield a more usable, comfortable display that fits 95 percent of adult face shapes without cumbersome per-user calibration.
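As a rough illustration of why a large eye-box is hard to achieve, simple pinhole geometry relates eye-box size, eye relief, and FOV to the size of the waveguide area that must emit light. The sketch below uses the Phlox figures quoted above; the formula and function name are illustrative assumptions, not WaveOptics’ actual design math.

```python
import math

def output_aperture_mm(eyebox_mm, eye_relief_mm, fov_deg):
    """Back-of-envelope size of the waveguide's light-emitting region
    needed so rays at the edge of the FOV still reach every position
    in the eye-box (pinhole geometry, ignoring the eye's pupil size)."""
    half_fov = math.radians(fov_deg / 2)
    return eyebox_mm + 2 * eye_relief_mm * math.tan(half_fov)

# Quoted Phlox parameters: 40° FOV, 19 x 15 mm eye-box, 25 mm eye relief.
width = output_aperture_mm(19, 25, 40)   # ~37.2 mm
height = output_aperture_mm(15, 25, 40)  # ~33.2 mm
```

The takeaway: widening the FOV or enlarging the eye-box both grow the emitting area the waveguide must fill, which is exactly what pupil expansion provides.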

In January, WaveOptics announced the availability of AR modules that combine its Phlox waveguide with a light engine and driver from DLP projector maker Coretronic Corp. of Taiwan, mounted on a lightweight frame (Figure 4). Other AR smart glasses developers can purchase the modules as a base platform, shortening their time to market. The partners hope to launch commercial AR end-user products for under $600 by 2019.


Figure 4.
The Phlox transparent optical waveguide transmits light via total internal reflection toward the eye in a process known as 2D pupil expansion. This enables a small light engine to support a large, full-color display for a lightweight AR device with flexible eye separation distances. Courtesy of WaveOptics.
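Total internal reflection, which keeps the projected light trapped inside the transparent waveguide until the grating couples it out toward the eye, occurs only beyond a critical angle set by the refractive index. A minimal sketch of that threshold, assuming a hypothetical high-index waveguide glass of n ≈ 1.8 (the index is an illustrative value, not a Phlox specification):

```python
import math

def critical_angle_deg(n_waveguide, n_outside=1.0):
    """Minimum internal angle (measured from the surface normal) at
    which light is totally internally reflected at the waveguide-air
    boundary, from Snell's law: sin(theta_c) = n_outside / n_waveguide."""
    return math.degrees(math.asin(n_outside / n_waveguide))

theta_c = critical_angle_deg(1.8)  # ~33.7 degrees
```

Rays steeper than this angle bounce losslessly along the glass, which is why higher-index materials support wider fields of view.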

According to David Grey, co-founder of WaveOptics, another key challenge to consumer AR smart glasses is in volume manufacturing.

“We can use standard fabrication processes from the optics and electronics industries to manufacture our waveguides, which is critical for cost-effective mass-manufacturing of AR smart glasses for consumers,” he said.

A long road to adoption

Adoption may take time because mobile AR is still in its very early stages. Analysts at Digi-Capital draw an analogy: Uber didn’t take off until two years after the iPhone launched. AR developers must learn what it takes to get consumers and businesses to adopt mobile AR in volume, much less AR wearables. Digi-Capital forecasts that even with a combined installed base of 900 million devices for ARKit and ARCore by the end of 2018, AR/VR revenue will not begin to scale until 2019.

“This technology is promising, causing many major players to work on smart glasses and AR solutions,” said Kai Stroeder, managing director at Tooz Technologies. “In the end, users will decide if and how smart glasses are ‘the next big thing.’”

Researchers are pursuing the cutting-edge technology that will make up the next generation of AR. At the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), a new type of optical lens shows potential in AR and dozens of other optical applications. Federico Capasso, the Robert L. Wallace Professor of Applied Physics and Vinton Hayes Senior Research Fellow in Electrical Engineering at SEAS, and colleagues are developing an adaptive metalens — a flat, flexible, tunable lens made of silicon nanostructures (Figure 5). The metalens measures only 30 µm thick and 100 µm across — about the size of a single piece of glitter. When attached to a sheet of elastomer controlled by a voltage, the lens stretches laterally to dynamically autofocus, zoom, and correct for aberrations such as astigmatism and image shift in real time — capabilities beyond those of the human eye, in a far smaller package.


Figure 5.
A prototype artificial eye positions a metalens on an elastomer sheet that enables applied voltage to control focus, zoom, and astigmatism, with potential in future AR/VR systems. Courtesy of Harvard SEAS.

The artificial eye controls the phase profile imparted to the outgoing optical wavefront, achieving diffraction-limited focusing with large focal-length tuning and control of aberrations. Effectively, the device could replace comparatively bulky optical lenses in smartphones and chip-scale image sensors, and equip smart glasses with AR/VR capabilities.
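The cited She et al. paper reports that uniformly stretching such a metalens laterally by a factor s scales its focal length by s², which is what makes electrical stretching an effective zoom mechanism. A minimal sketch of that scaling relationship (the function name and numbers are illustrative, not from the paper):

```python
def stretched_focal_length(f_mm, stretch):
    """Focal length of a metalens after uniform lateral stretch.
    Scaling the lens laterally by a factor `stretch` scales the
    focal length by stretch**2 (scaling law from She et al., 2018)."""
    return f_mm * stretch ** 2

# A modest 5% lateral stretch already shifts the focus by ~10%:
f_new = stretched_focal_length(10.0, 1.05)  # 10 * 1.05**2 = 11.025 mm
```

The quadratic dependence means small elastomer strains translate into large focal sweeps, which is why modest voltages can, in principle, deliver useful autofocus range.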

According to metalens researcher Alan She, “The adaptive metalens is commercializable with exciting applications in many consumer-based devices, from smartphone cameras to wearable displays and microscopes with real-time aberration control.”1

In 2017, Capasso founded a spinoff company — Metalenz — to commercialize the adaptive optical system; the company has obtained venture capital seed funding and is pursuing a Series B round. Remaining challenges include reducing the required voltage from kilovolts to hundreds of volts and establishing a cost-effective, reproducible fabrication process.

“We may see stretchable metalenses become an industrial reality in perhaps the next five years,” Capasso said. “The future looks bright.”

References

1. A. She et al. (2018). Adaptive metalenses with simultaneous electrical control of focal length, astigmatism, and shift. Science Advances, Vol. 4, Issue 2, doi:10.1126/sciadv.aap9957.

©2024 Photonics Media