

Display Technologies Shape the Immersive Experience

BARRY YOUNG, OLED ASSOCIATION

Almost four years ago, Facebook’s Mark Zuckerberg plunked down a reported $3 billion for technology company Oculus VR and said, “We believe this kind of immersive, augmented reality [AR] will become a part of daily life for billions of people.” Last year, testifying in a lawsuit against Oculus, Zuckerberg acknowledged that virtual reality (VR) hasn’t taken off as quickly as he had anticipated and that VR sales “won’t be profitable for quite a while.”

VR isn’t as hot as everyone thought, and the evidence is compelling. For most people, the cost is prohibitive: Headsets start at $400 to $600 and go up from there if motion controllers, eye trackers and other accessories are added. Plus, the headset is often tethered to a powerful computer that limits the user’s mobility and costs $500 to $1,500.


Courtesy of Meta.


Sony’s new PlayStation VR, designed to work with the tens of millions of installed PlayStation 4 consoles, was expected to set the industry on its ear, but early enthusiasm for both the hardware and the content quickly dissipated. Google’s well-publicized Glass and Glass 1 never reached the masses and are now relegated to industrial applications. The company recently slashed the price of its new Daydream View headset from $99 to $49 because it was disappointed with early usage numbers. Meanwhile, overall sales of VR headsets have been well below expectations. According to data compiled by market research firms, the total number of headsets sold in 2016 was 5.18 million, excluding Google Cardboard, and more than half of those were the low-cost Samsung Gear VR and Google Daydream View, which rely on a smartphone’s display.

AR/VR re-energizing 3D

The AR/VR world is in a quandary. Companies have spent tens of billions of dollars on what was expected to be the next smartphone, a device that, according to Zuckerberg, Apple’s Tim Cook and Microsoft’s Satya Nadella, would affect the daily lives of billions of people.



Courtesy of Statista/IDC.



While VR headsets seem to work in niche applications — education and training, for example — they haven’t fit the consumer model, as current headsets are bulky, expensive and constrain movement. Moreover, experts have confused consumers with terms such as virtual, merged and augmented reality, and immersive computing without clearly defining them.

The market research and analysis firm Gartner is more optimistic, forecasting that head-mounted display shipments, which totaled 16 million units in 2016, will grow to 67 million by 2021. Even so, VR has not matched the growth that smartphones achieved in their early years.


Courtesy of IDC/Canaccord Genuity.



AR/VR is expected to re-energize 3D after the movie industry’s failed attempts. While Facebook, texting and FaceTime are 2D experiences, VR/AR operates in 3D space. Web design must change if, instead of scrolling up/down and left/right, the web page itself occupies a 3D space. Instead of studying pictures of human anatomy, every part of the human body, inside and out, would be available through a more natural interface, because bodies physically operate in 3D. Some believe that AR/VR will not truly have arrived until such everyday milestones are reached. Industry is already adapting:

• The medical community has discovered that VR is more effective than morphine for pain management in burn units, according to findings from the Human Photonics Laboratory at the University of Washington.

• Following a year of VR therapy, paraplegics participating in a study began to regain control of certain functions, such as bladder control.

• GE is training workers using Google Glass and AR technology.

LCDs vs. OLEDs

The challenges in reaching a totally immersive experience are formidable, but the display industry appears to be up to the task. Two display technologies are vying for this market: LCDs and OLEDs, each available as direct-view panels or microdisplays, and each with different characteristics.


Light-field headset with embedded LCD. Courtesy of Stanford Labs.



OLEDs are the technology of choice for VR because of their advantages in latency, contrast ratio, response time and black levels, and they are used in all the popular VR headsets, such as the HTC Vive, Oculus Rift and PlayStation VR. AR requires higher luminance because the display must operate in any ambient condition, so liquid crystal on silicon (LCOS) has been used in Google Glass. The challenge is how to make displays that are comfortable near the eye, have a wide field of view, support 3D, and operate in any ambient environment, dark or bright. Microdisplays help because their smaller pixels and higher pixel density (ppi) eliminate the screen-door effect that appears when the ppi is too low for the distance between the eye and the display.
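
A quick way to see why pixel density matters is to convert a panel’s ppi into angular resolution, in pixels per degree, across the field of view the lens delivers. The short Python sketch below uses illustrative numbers; the panel widths, ppi values and fields of view are assumptions rather than vendor specifications, and the human eye resolves roughly 60 pixels per degree.

def pixels_per_degree(panel_width_in, ppi, fov_deg):
    """Approximate angular pixel density delivered to the eye.

    panel_width_in -- active display width in inches (per eye)
    ppi            -- pixels per inch of the panel
    fov_deg        -- horizontal field of view the lens spreads the panel across
    """
    horizontal_pixels = panel_width_in * ppi
    return horizontal_pixels / fov_deg

# Illustrative numbers only (assumptions, not vendor specifications):
# a smartphone-class panel stretched over a wide field of view versus a
# high-ppi OLED microdisplay over a narrower one. The eye resolves ~60 px/deg.
for name, width_in, ppi, fov in [("smartphone-class panel", 3.6, 460, 100),
                                 ("OLED microdisplay", 0.9, 2000, 40)]:
    ppd = pixels_per_degree(width_in, ppi, fov)
    verdict = "screen-door visible" if ppd < 30 else "screen-door suppressed"
    print(f"{name}: {ppd:.0f} px/deg ({verdict})")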


Image convergence is the process of integrating what the left and the right eye see into one image. Courtesy of Oculus.

Kopin, a world leader in LCOS display technology, announced the availability of a new OLED microdisplay. The company is well positioned, as it is the primary supplier of microdisplays for Google Glass and Google Glass 1.

To attract consumers and reach billions of units per year, the product must be smaller and more like eyeglasses than current headsets, and it must achieve higher resolution (3K) to eliminate the screen-door effect of current displays (see the screen-door comparison image). The microdisplay must also support a fast refresh rate and response time to eliminate latency, on the order of 90 to 120 Hz with microsecond response, and operate untethered from a high-end PC. The smartphone would be a good substitute, but its graphics processor needs to reach the next level of performance to handle the bandwidth.
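
To get a feel for that bandwidth, the sketch below estimates the raw, uncompressed pixel rate a stereoscopic 3K-class headset would demand. The per-eye resolution (1440 × 1600) and 24 bits per pixel are assumed figures for illustration only.

def raw_pixel_bandwidth_gbps(h_px, v_px, eyes, bits_per_pixel, refresh_hz):
    """Raw, uncompressed pixel rate the graphics pipeline must sustain (Gbit/s)."""
    return h_px * v_px * eyes * bits_per_pixel * refresh_hz / 1e9

# Assumed figures for illustration: 1440 x 1600 pixels per eye ("3K" across both
# eyes), 24 bits per pixel, at the 90 and 120 Hz refresh rates cited above.
for hz in (90, 120):
    print(f"{hz} Hz: ~{raw_pixel_bandwidth_gbps(1440, 1600, 2, 24, hz):.1f} Gbit/s")

Roughly 10 to 13 Gbit/s of raw pixel data, before any compression or foveated rendering, suggests why smartphone-class graphics processors still need another step up in performance.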


The display luminance would have to reach ~1000 nits to overcome ambient light and preserve the rendered image, even in a partially light-controlled environment. Motion controllers, eye trackers and other accessories should all be built in. Batteries should be designed on flexible substrates and offer double the current capacity.


Courtesy of Samsung.

Hopewell Junction, N.Y.-based eMagin is betting on OLED technology for tomorrow’s AR applications. Unlike LCOS, OLEDs can reach >2000 ppi. They support higher contrast levels, achieve faster response times and are transparent. They can also reach acceptable luminance levels. eMagin can achieve luminance of over 1000 nits because it is the first company able to pattern OLEDs at the micron level and therefore does not require a color filter, which blocks >50 percent of the light.
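
The arithmetic behind that claim is simple. Using the article’s figure that a color filter blocks more than 50 percent of the light, a filtered white-OLED stack must emit at least twice the target luminance, while a directly patterned RGB panel pays no such penalty. A minimal sketch:

# Rough throughput comparison using the article's figure that a color filter
# blocks >50 percent of the light. To deliver a target luminance through such a
# filter, the white-OLED stack underneath must emit at least twice as much;
# a directly patterned RGB OLED pays no filter penalty.
target_nits = 1000
filter_transmittance = 0.5  # at most 50 percent of the light gets through

print("white OLED + color filter:", target_nits / filter_transmittance, "nits at the emitter")
print("direct-patterned RGB OLED:", target_nits, "nits at the emitter")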


Kopin OLED display with pancake-thin optics next to Oculus. Courtesy of Kopin.

There are other solutions, such as light-field displays and holograms. A hologram is a photographic recording of a light field, rather than an image formed by a lens, and is used to display a fully 3D image of a “holographed” subject. However, in these designs, a basic imaging device such as an OLED or LCD is still necessary.


Screen-door effect showing 600 ppi (left) vs. 2000 ppi (right). Courtesy of eMagin.

There remains a chasm between where immersive technology is today and the experience-simulation systems depicted in “Inception” or “The Matrix.” To achieve true, reality-like immersive experiences, the human body must be modified, and computers connected directly to the human brain. It won’t happen until we can integrate prosthetic eyes/ears/limbs into the nervous system, which is still very much in the experimental stage. Devices and signal transceivers would have to be surgically implanted and attached to the brain. A responsive, Matrix-like simulation with the resolution or detail needed to completely fool the human brain is well beyond the reach of today’s hardware and software. In his book “The Singularity Is Near,” computer scientist and futurist Ray Kurzweil projects these capabilities around 2035.

Meet the author

Barry Young is CEO of the OLED Association, where he solves industrywide issues for the OLED display and lighting industries, promotes the technology, and maintains databases on OLED shipments, revenue and capacity. A founder of DisplaySearch, Young is a leading authority on OLEDs and flexible displays; email: beyoung2@gmail.com.



Virtual Reality (VR)

Virtual reality (VR) deals with the digital world. Everything seen on the headset is computer generated, and the environment is totally controlled in terms of brightness and content. Because ambient light is blocked, the device needs only limited brightness, but it requires high contrast, high pixel density and fast response to keep up with changing images. A small form factor is also required.


Virtual reality (VR) headset and concept. Courtesy of Sony and Intel.

VR systems use the principle of stereoscopic vision to simulate the perception of depth and 3D structures. To achieve this, the system has to generate separate images for each eye, one slightly offset from the other, to simulate parallax.
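
As a minimal illustration of that per-eye offset, the sketch below shifts a shared camera position by half the interpupillary distance in each direction. The IPD value is an assumed average, and real runtimes such as the Oculus SDK or OpenVR supply per-eye transforms directly.

# Minimal sketch of the per-eye viewpoint offset that creates parallax.
# The interpupillary distance (IPD) below is an assumed average; real runtimes
# such as the Oculus SDK or OpenVR provide per-eye transforms directly.
IPD_METERS = 0.064

def eye_offset(eye):
    """Horizontal translation applied to the shared camera position for one eye."""
    half_ipd = IPD_METERS / 2.0
    return (-half_ipd, 0.0, 0.0) if eye == "left" else (half_ipd, 0.0, 0.0)

# Each frame is rendered twice, once per eye, from the two shifted viewpoints.
for eye in ("left", "right"):
    print(eye, eye_offset(eye))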

• Display: The VR system renders a stereoscopic image on the head-mounted display with a minimum frame rate of 60 fps to avoid any perceived lag that might break the illusion or, worse, lead to the nausea that is often associated with poorly performing VR.

• Optics: A pair of lenses sits between the eyes and the display, allowing the eyes to focus on the nearby screen and converge the two images so the brain perceives them with a sense of depth. The distortion the lenses introduce must be corrected in rendering (see the sketch after this list).

• Motion tracking: After the display, the next essential trick for making the brain believe it is in another place is to track movements of the head and update the rendered scene without any lag.
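
One common way to handle the distortion introduced by the lenses described above is a radial pre-warp applied when the rendered frame is sampled for the display. The polynomial form below follows the approach popularized by early Oculus development kits, but the k1 and k2 coefficients are illustrative assumptions, not values from any shipping headset.

# Radial pre-warp sketch: the headset lens adds pincushion distortion, so the
# renderer samples the frame with an outward radial scale, which compresses the
# displayed image toward the center (barrel distortion) and cancels the lens.
# The k1/k2 coefficients are illustrative assumptions, not shipping values.
def distortion_sample_coord(x, y, k1=0.22, k2=0.24):
    """Map a normalized view coordinate (centered at 0,0) to the texture
    coordinate the distortion pass samples from."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

print(distortion_sample_coord(0.0, 0.0))  # center is unchanged
print(distortion_sample_coord(0.5, 0.5))  # edge coordinates are pushed outward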



VR Motion-to-Photon (MTP) Latency

To maintain 60 fps, the system must accept the input, compute the application’s response and update the display within 16.67 ms, the motion-to-photon (MTP) latency budget. Most believe the pipeline needs to run at 90 or 120 fps, which leaves so little headroom that a display response time in the millisecond range is too slow.
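
Those budgets follow directly from the refresh rate, as this short sketch shows.

# Frame-time budget per refresh rate: the whole motion-to-photon chain
# (sensor read, simulation, rendering, scan-out, pixel response) must fit in it.
for fps in (60, 90, 120):
    print(f"{fps} fps -> {1000.0 / fps:.2f} ms per frame")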

Usually an array of sensors, such as gyroscopes and accelerometers, is used to track the movement of the user’s head. The information is passed to the computing platform to update the rendered images accordingly. Apart from head tracking, advanced VR systems aiming for better immersion also track the position of the user in the real world.
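
One widely used way to fuse those sensors, though not necessarily the method any particular headset implements, is a complementary filter: the gyroscope is integrated for fast, low-latency orientation updates, and the accelerometer’s gravity reading slowly corrects the drift that integration accumulates. The sensor samples below are hypothetical.

import math

def accel_pitch_deg(ax, ay, az):
    """Pitch angle implied by the gravity vector seen by the accelerometer."""
    return math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

def complementary_filter(prev_pitch_deg, gyro_rate_dps, accel_deg, dt, alpha=0.98):
    """Fuse the fast-but-drifting gyro with the slow-but-absolute accelerometer."""
    gyro_estimate = prev_pitch_deg + gyro_rate_dps * dt
    return alpha * gyro_estimate + (1.0 - alpha) * accel_deg

# Hypothetical 1 kHz samples (dt = 1 ms); values are for illustration only.
pitch = 0.0
for gyro_dps, (ax, ay, az) in [(50.0, (0.05, 0.0, 0.99)),
                               (45.0, (0.08, 0.0, 0.99))]:
    pitch = complementary_filter(pitch, gyro_dps, accel_pitch_deg(ax, ay, az), dt=0.001)
    print(f"estimated pitch: {pitch:.3f} deg")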

• Interaction devices/controllers: A VR system has to allow the user to interact with the virtual environment. It needs specialized input devices, as simple as the magnetic button on Google Cardboard or as advanced as hand- and body-tracking sensors that can recognize motion and gestures, such as Leap Motion or the HTC Vive tracking system.

• Computing platform: Finally, a computing platform does all the heavy lifting, very fast. VR systems typically use PCs, game consoles or smartphones as the computing platform.


Augmented Reality (AR)

Augmented reality (AR) combines the real world and the virtual world in a seamless and interactive way. Ultimately, it could be a lifestyle product that is used more than a smartphone. Most daily interactions with people, products and brands might filter through one’s personal AR device.


Augmented reality (AR) concept and headset. Courtesy of Intel/Meta 2 and Intel.



Merged Reality (MR)

Merged reality (MR) is a term that can, for all intents and purposes, be interchanged with AR. The three basic components are similar to a VR system, including the head-mounted display, tracking system and mobile computer for the hardware. The goal of this new technology is to merge these three components into a highly portable unit much like a combination of a high-tech Walkman and an ordinary pair of eyeglasses.


A depiction of a merged reality (MR) experience (left) and overall concept (right). Courtesy of Intel/Magic Leap.



• Head-mounted display: The head-mounted display enables the user to view superimposed graphics and system-created text or animation. Two head-mount design concepts are being researched: video see-through systems and optical see-through systems. Video see-through systems block the user’s direct view of the outside environment and replay it in real time from a camera mounted on the headgear; latency in updating the image whenever the user moves his or her head is an issue. Optical see-through systems, on the other hand, use technology that “paints” the images directly onto the user’s retina through rapid movement of the light source. Cost is an issue, but researchers believe this approach is more portable and inconspicuous for future augmented reality systems.

• Tracking and orientation: The tracking and orientation system includes GPS to pinpoint the user’s location relative to his or her surroundings, and it also tracks the user’s eye and head movements. The complicated procedures of tracking overall location and user movement, and adjusting the displayed graphics accordingly, are among the major hurdles in developing this technology. Even the best systems developed so far still experience latency between the user’s movement and the display of the image.
