3-D View Has Neural Base
Apr 2008
ROCHESTER, N.Y., April 1, 2008 -- A small area of the brain that combines visual and nonvisual cues is behind our ability to perceive depth with one eye, a University of Rochester team has discovered.

"It looks as though in this area of the brain, the neurons are combining visual cues and nonvisual cues to come up with a unique way to determine depth," said Greg DeAngelis, a professor in the Department of Brain and Cognitive Sciences at the university.

Humans and other animals are able to visually judge depth because we have two eyes and the brain compares the images from each. But we can also judge depth with only one eye, and scientists have been searching for how the brain accomplishes that feat. DeAngelis's team believes it has discovered the answer in a small part of the brain that processes both the images from a single eye and also the motion of our bodies.

DeAngelis said that means the brain uses a whole array of methods to gauge depth. In addition to two-eyed "binocular disparity," the brain makes use of other cues such as motion, perspective, and how objects pass in front of or behind each other to create a representation of the 3-D world in our minds.

The findings could eventually help instruct children born with misaligned eyes to restore more normal binocular vision in the brain. They could also help construct more compelling virtual reality environments, since making virtual reality as convincing as possible requires knowing exactly how our brains construct 3-D percepts, DeAngelis said.

The new neural mechanism is based on the fact that objects at different distances move across our vision with different directions and speeds, due to a phenomenon called motion parallax, DeAngelis said in a statement.

"When staring at a fixed object, any motion we make will cause things nearer than the object to appear to move in the opposite direction, and more distant things to appear to move in the same direction. To figure out the real 3-D layout of a scene," DeAngelis said, "the brain needs one more piece of information, and it pulls in this information from the motion of the eyeball itself."
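The geometry DeAngelis describes can be sketched with a toy model (this is an illustration, not the study's own analysis): for an observer translating sideways at speed v while the eye rotates to keep a fixation point at distance D_f centered, an object at distance d slips across the retina at roughly v·(1/d − 1/D_f), under a small-angle approximation. The sign of that slip flips at the fixation distance, which is exactly the near/far cue described above.

```python
# Toy motion-parallax model (hypothetical illustration, not from the paper).
# An observer moves laterally at speed v while the eye tracks a fixation
# point at distance d_fixation. Under a small-angle approximation, an object
# at distance d_object slips across the retina at v*(1/d_object - 1/d_fixation)
# radians per second, relative to the tracking gaze.

def retinal_slip(v, d_object, d_fixation):
    """Apparent angular velocity (rad/s) of an object during lateral
    self-motion. Positive means the object appears to move opposite to
    the head (nearer than fixation); negative means it appears to move
    with the head (farther than fixation)."""
    return v * (1.0 / d_object - 1.0 / d_fixation)

v = 0.1           # lateral head speed, m/s (assumed value)
fixation = 2.0    # fixation distance, m (assumed value)

near = retinal_slip(v, 1.0, fixation)   # object nearer than fixation
far = retinal_slip(v, 4.0, fixation)    # object farther than fixation

print(near > 0, far < 0)  # near and far objects slip in opposite directions
```

From retinal motion alone the magnitude of slip is ambiguous about depth; it is the sign relative to the eye's own tracking movement that sorts near from far, which is the extra piece of information the quote refers to.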

He said neurons in the middle temporal area of the brain combine visual information with physical movement to extract depth information. From vision alone, the motion of near and far objects can be confused. But if the eye is moving while tracking the overall movement of a group of objects, the middle temporal neurons have enough information to determine that objects moving across the scene in the same direction as the head must be far away, whereas objects moving in the opposite direction must be close by.

"We use binocular disparity, occlusion, perspective and our own motion all together to create a representation of the real, 3-D world in our minds," said DeAngelis.

The research was conducted in collaboration with Jacob W. Nadler and Dora E. Angelaki of Washington University and was funded by the National Institutes of Health. The findings were published in the March 20 online issue of the journal Nature.



©2019 Photonics Media, 100 West St., Pittsfield, MA, 01201 USA
