
UK Agency Funds Active Vision Research

Daniel S. Burgess

The UK Engineering and Physical Sciences Research Council has awarded a three-year grant of approximately $470,000 to scientists at Oxford University for the further development of a dynamic approach to simultaneous localization and map building that employs active vision in a single handheld camera. The project expands on a prior grant from the agency, which resulted in the development of a proof-of-concept prototype.

The UK Engineering and Physical Sciences Research Council is funding efforts at Oxford University to develop a single-camera simultaneous localization and mapping system. The work has applications in robotics as well as in the fusion of live video with computer-generated effects, such as in this demonstration, in which virtual shelving and a virtual table have been added to video of a real kitchen. Courtesy of Andrew J. Davison.

Simultaneous localization and map building refers to a strategy in robotics wherein a system produces a representation of the surrounding environment that also establishes the position of the system within it. Typically, this has involved the use of 2-D mapping strategies employing sonar or laser rangefinders.
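The defining feature of this strategy is that the system's own pose and the map's landmarks are estimated jointly, so a measurement of any landmark also refines the estimate of the system's position. A minimal 1-D sketch of that joint update, using a single scalar Kalman step (a toy illustration with made-up numbers, not the Oxford system or any particular rangefinder), might look like this:

```python
# Toy 1-D illustration of the joint estimation at the heart of SLAM:
# the state holds both the robot's position and a landmark's position,
# and a single range measurement updates both together.

def slam_update(state, cov, rng_meas, rng_var):
    """One scalar Kalman update for state = [robot_x, landmark_x].

    Measurement model: range = landmark_x - robot_x, so the
    measurement Jacobian is H = [-1, +1].
    """
    h = [-1.0, 1.0]
    # Innovation: measured range minus predicted range
    innov = rng_meas - (state[1] - state[0])
    # P H^T and the innovation variance S = H P H^T + R
    phT = [cov[0][0] * h[0] + cov[0][1] * h[1],
           cov[1][0] * h[0] + cov[1][1] * h[1]]
    s = h[0] * phT[0] + h[1] * phT[1] + rng_var
    # Kalman gain K = P H^T / S
    k = [phT[0] / s, phT[1] / s]
    new_state = [state[0] + k[0] * innov, state[1] + k[1] * innov]
    new_cov = [[cov[i][j] - k[i] * phT[j] for j in range(2)]
               for i in range(2)]
    return new_state, new_cov

# Robot believes it is at 0.0 m, landmark roughly at 5.0 m, both uncertain.
state, cov = [0.0, 5.0], [[1.0, 0.0], [0.0, 4.0]]
state, cov = slam_update(state, cov, rng_meas=4.0, rng_var=0.1)
```

After the update, the two estimates have moved toward agreement with the measured range, and the covariance has acquired an off-diagonal term: the robot and landmark estimates are now correlated, which is what lets later landmark observations correct the robot's pose.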

The Oxford scientists instead are investigating an active vision approach, in which a camera system selects persistent, relatively large features in its 3-D surroundings as landmarks that are continually evaluated with regard to their utility for navigation. To establish the depth of landmarks in the visual field, the system generates a set of depth hypotheses, based on assumed dimensions of the space, that are weighted against subsequent observations at later time steps as the camera moves. After a few such comparisons, the distribution of probable depths can be approximated as Gaussian, and the landmark is initialized as a point in the 3-D map.
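The hypothesis-testing step above can be sketched as a simple particle scheme: depths are sampled along the landmark's ray, each hypothesis is reweighted by how well it predicts each new observation as the camera translates, and once the distribution is peaked it is collapsed to a Gaussian. The numbers, the linear parallax model, and the Gaussian noise model below are illustrative assumptions, not details of the Oxford prototype:

```python
import math

def init_depth_hypotheses(d_min, d_max, n):
    """Uniform depth hypotheses along the landmark's ray, equal weights."""
    depths = [d_min + (d_max - d_min) * i / (n - 1) for i in range(n)]
    return depths, [1.0 / n] * n

def reweigh(depths, weights, observed, predict, sigma):
    """Weight each depth by how well it predicts the new observation.

    predict(d) maps a hypothesized depth to a predicted image
    measurement for the current camera pose; observed is the actual one.
    """
    likes = [w * math.exp(-0.5 * ((observed - predict(d)) / sigma) ** 2)
             for d, w in zip(depths, weights)]
    total = sum(likes)
    return [l / total for l in likes]

def gaussian_fit(depths, weights):
    """Collapse the hypothesis set to (mean, std) once it is peaked."""
    mean = sum(d * w for d, w in zip(depths, weights))
    var = sum(w * (d - mean) ** 2 for d, w in zip(depths, weights))
    return mean, math.sqrt(var)

# Toy run: true depth is 2.0 m; as the camera translates, parallax
# (modelled crudely as observed = baseline / depth) discriminates depths.
depths, weights = init_depth_hypotheses(0.5, 5.0, 100)
for baseline in (0.05, 0.10, 0.15):
    observed = baseline / 2.0
    weights = reweigh(depths, weights, observed,
                      predict=lambda d, b=baseline: b / d, sigma=0.005)
mean, std = gaussian_fit(depths, weights)
```

After a few camera motions the weight mass concentrates near the true depth, at which point the landmark can be initialized as a 3-D map point with Gaussian uncertainty.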

Beyond navigation in robotics, the work has potential applications in broadcast entertainment. With such a real-time understanding of the position of a camera, computer-generated image effects could be integrated with a live video feed to create, for example, virtual sets.

Photonics Spectra
Mar 2005
