A super multiview technique has produced a 3D display that reduces the physical viewer discomfort that currently limits the widespread application of the technology. The technique greatly decreases the required number of microdisplays, meaning it could even be integrated with portable and wearable devices.

“There are many causes for 3D-viewing discomfort, but the most substantial one is the vergence-accommodation conflict,” said professor Lilin Liu of Sun Yat-Sen University. Vergence-accommodation conflict is a mismatch between the point at which the eyes converge on an image and the distance to which they focus when viewing 3D images, Liu said.

Human eyes are separated by about six centimeters, which means that when viewing an object, the two eyes see slightly different images. The brain directs both eyes to the same object, and the distance at which the eyes’ sight lines cross is the vergence distance. Meanwhile, the brain adjusts the focus of the lens within each eye to make the image sharp and clear on the retina. The distance to which the eye is focused is referred to as the accommodative distance. Failure to converge leads to double images, while misaccommodation results in blurry images.

In natural viewing, the human vergence and accommodation responses are coupled and adjust simultaneously. In other words, the vergence and accommodative distances are almost always the same, enabling a clear and comfortable view of the object.

Conventional 3D displays try to mimic natural viewing by creating images with varying binocular disparity, which simulates the vergence changes of a natural 3D scene. But the accommodative distance remains fixed at the display distance, resulting in the so-called vergence-accommodation conflict that causes viewer discomfort, the researchers said.

“Conventional 3D displays usually deliver some views of the displayed spatial spot to a single eye pupil.
That is why accommodative distance remains fixed on the display screen and cannot adjust simultaneously as vergence distance does, causing vergence-accommodation conflict,” said Liu.

The team’s solution was to project numerous 2D perspective views to viewpoints with intervals smaller than the pupil diameter of the eye, allowing the device to deliver at least two different views to a single eye pupil.

“Our proposed scheme overcomes vergence-accommodation conflict by delivering more than two views to a single eye pupil, making the eyes focus on the displayed image naturally,” said researcher Dongdong Teng. “Also, the prototype in our study is 65 mm thin, and the system could become thinner with improvement in structural elements, which provides a demo for comfortable 3D wearable electronics or portable displays.”

The team’s prototype system consisted of 11 elementary projecting units, each comprising an organic LED (OLED) microdisplay, a rectangular projecting lens, two vertical baffles and a group of gating apertures (a liquid crystal panel) attached to the projecting lens. By opening the gating apertures in sequence and synchronously refreshing the virtual image of the corresponding microdisplay, the researchers obtained dense viewpoints on the display screen, which Liu said was the key to a comfortable 3D effect.

To test viewers’ reactions to the prototype system, eight subjects were asked to observe a displayed 3D image of an apple in the lab environment; no headache or discomfort was reported.

Because the gating aperture array was adhered to the projecting unit array, the prototype structure was thin, around 65 mm. Liu said other adjustments to the device could make it even thinner, which will be a focus of the team’s future work.

The research was published in Optics Express, a publication of The Optical Society (OSA) (doi: 10.1364/oe.24.004421).
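The vergence-accommodation geometry described above can be sketched numerically. The interpupillary distance of about 6 cm comes from the article; the screen and simulated-object distances below are illustrative assumptions, not values from the study.

```python
import math

# Sketch of the vergence-accommodation conflict geometry.
# IPD of ~6 cm is from the article; the two viewing distances
# below are illustrative assumptions.

IPD_M = 0.06  # interpupillary distance, about six centimeters


def vergence_angle(distance_m: float) -> float:
    """Angle (radians) between the two eyes' sight lines when they
    converge on a point straight ahead at `distance_m`."""
    return 2.0 * math.atan((IPD_M / 2.0) / distance_m)


# On a conventional stereoscopic display, the eyes converge on the
# simulated depth of the object, but each eye still accommodates
# (focuses) on the physical screen.
screen_m = 0.5      # accommodative distance: the display surface (assumed)
simulated_m = 0.3   # vergence distance: where the object appears (assumed)

conflict_deg = math.degrees(
    vergence_angle(simulated_m) - vergence_angle(screen_m)
)
print(f"vergence-accommodation mismatch: {conflict_deg:.2f} degrees")
```

In natural viewing the two distances coincide and this mismatch is zero; on a flat stereoscopic display it grows as the simulated depth departs from the screen plane.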
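The super multiview condition, viewpoint intervals smaller than the pupil diameter so that at least two views enter one eye, can also be checked with a small sketch. The pupil diameter and interval values are illustrative assumptions, not measurements from the paper.

```python
# Sketch of the super multiview condition: if the viewpoint interval is
# smaller than the pupil diameter, two or more perspective views enter a
# single eye pupil. Numbers are illustrative assumptions.

PUPIL_MM = 4.0  # typical pupil diameter under indoor lighting (assumed)


def views_in_pupil(viewpoint_interval_mm: float) -> int:
    """Approximate count of adjacent viewpoints spanned by one pupil."""
    return int(PUPIL_MM // viewpoint_interval_mm) + 1


# Conventional stereoscopic display: viewpoint spacing at or beyond the
# pupil diameter, so each eye receives only one view.
print(views_in_pupil(6.0))  # one view per pupil

# Super multiview: spacing below the pupil diameter, so two or more
# views reach each pupil and accommodation can follow the displayed depth.
print(views_in_pupil(1.5))  # multiple views per pupil
```

Delivering more than one view per pupil is what allows the eye to refocus on the displayed image itself rather than on the screen, removing the conflict.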