
Optimal Algorithm Developed for Determining Focus Error

AUSTIN, Texas, Sept. 27, 2011 — University of Texas researchers have discovered how to extract and use information in an individual image to determine how far objects are from the focus distance, a feat accomplished previously only by human and animal visual systems.

Like a camera, the human eye has an autofocusing system, but unlike a camera's, it rarely makes mistakes and does not rely on trial and error to focus on an object.

It is significant that a statistical algorithm can now determine focus error (how much a lens must be refocused to make the image sharp) from a single image, without trial and error, said Johannes Burge of the university's Center for Perceptual Systems.

“Our research on defocus estimation could deepen our understanding of human depth perception,” Burge added. “Our results could also improve autofocusing in digital cameras. We used basic optical modeling and well-understood statistics to show that there is information lurking in images that cameras have yet to tap.”

The algorithm developed by Burge and his colleague, Wilson Geisler, can be applied to any blurry image to determine focus error. An estimate of focus error also makes it possible to determine how far objects are from the focus distance. They describe their work in an upcoming issue of Proceedings of the National Academy of Sciences.

In the human eye, inevitable defects in the lens, such as astigmatism, can help the visual system (via the retina and brain) compute focus error; the defects enrich the pattern of “defocus blur,” which is caused when a lens is focused at the wrong distance. Humans use defocus blur to both estimate depth and refocus their eyes. Many small animals use defocus as their primary depth cue.

“We are now one step closer to understanding how these feats are accomplished,” said Geisler, director of the Center for Perceptual Systems. “The pattern of blur introduced by focus errors, along with the statistical regularities of natural images, makes this possible.”

Burge and Geisler considered what happens to images as focus error increases: More and more detail is lost as the error grows. They then noted that, even though the content of images varies considerably (e.g., faces, mountains or flowers), the pattern and amount of detail in images are remarkably constant. This constancy makes it possible to determine the amount of defocus and, in turn, to refocus appropriately.
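The principle can be illustrated with a short numerical sketch. This is not Burge and Geisler's actual algorithm (which derives optimal filters from measured natural-image statistics and lens optics); it is a minimal toy model, with hypothetical helper names, showing why the idea works: natural images share a roughly 1/f amplitude spectrum, defocus blur attenuates high spatial frequencies in a predictable way, so the surviving high-frequency power in a single image indexes the magnitude of the focus error.

```python
import numpy as np

rng = np.random.default_rng(0)

def synthetic_natural_image(n=128):
    """Random image with the ~1/f amplitude spectrum typical of natural scenes."""
    fy = np.fft.fftfreq(n)[:, None]
    fx = np.fft.fftfreq(n)[None, :]
    f = np.hypot(fx, fy)
    f[0, 0] = 1.0  # avoid division by zero at the DC term
    spectrum = (1.0 / f) * np.exp(2j * np.pi * rng.random((n, n)))
    img = np.real(np.fft.ifft2(spectrum))
    return (img - img.mean()) / img.std()

def defocus(img, sigma):
    """Approximate defocus blur with a Gaussian point-spread function of width sigma (pixels)."""
    n = img.shape[0]
    fy = np.fft.fftfreq(n)[:, None]
    fx = np.fft.fftfreq(n)[None, :]
    otf = np.exp(-2.0 * (np.pi * sigma) ** 2 * (fx**2 + fy**2))  # Gaussian OTF
    return np.real(np.fft.ifft2(np.fft.fft2(img) * otf))

def highfreq_fraction(img, cutoff=0.25):
    """Fraction of spectral power above a spatial-frequency cutoff (cycles/pixel)."""
    n = img.shape[0]
    fy = np.fft.fftfreq(n)[:, None]
    fx = np.fft.fftfreq(n)[None, :]
    power = np.abs(np.fft.fft2(img)) ** 2
    mask = np.hypot(fx, fy) > cutoff
    return power[mask].sum() / power.sum()

img = synthetic_natural_image()
fractions = [highfreq_fraction(defocus(img, s)) for s in (0.0, 1.0, 2.0)]
# Because the unblurred spectrum is statistically regular across images,
# a lower high-frequency fraction reliably signals a larger defocus.
assert fractions[0] > fractions[1] > fractions[2]
```

Mapping the measured high-frequency fraction back to a signed, calibrated focus error is the hard part; in the published work that is where lens imperfections such as astigmatism come in, breaking the symmetry between focusing too near and too far.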

