Gigapixel “supercamera” delivers sharp shots

Ashley N. Paddock, [email protected]

The challenge in creating high-pixel-count images lies in the sophistication of the integrated circuits rather than in the optics. Scientists have now overcome this hurdle by developing a supercamera that synchronizes 98 microcameras into a single device capable of stitching together images with a resolution of up to 50 gigapixels.

The Duke University camera, called AWARE 2, yields five times better resolution than that of 20/20 human vision over a 120° horizontal field, and it has the potential to capture up to 50 gigapixels, or 50,000 megapixels, of data. By comparison, consumer cameras can take photos with sizes ranging only from 8 to 40 megapixels.
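To put those figures in perspective, a quick back-of-the-envelope comparison in Python, using only the numbers quoted above, shows the gap between the camera's potential and today's consumer sensors; it is an illustrative sketch, not part of the Duke team's software.

```python
# Back-of-the-envelope comparison using only figures quoted in the article.
GIGAPIXEL = 1_000_000_000
MEGAPIXEL = 1_000_000

aware2_potential_px = 50 * GIGAPIXEL                    # "up to 50 gigapixels" of data
consumer_cameras_px = (8 * MEGAPIXEL, 40 * MEGAPIXEL)   # typical consumer range

print(f"50 gigapixels = {aware2_potential_px / MEGAPIXEL:,.0f} megapixels")
for px in consumer_cameras_px:
    print(f"vs a {px / MEGAPIXEL:.0f}-MP consumer camera: "
          f"{aware2_potential_px / px:,.0f}x more pixels")
```

Run as written, it reports 50,000 megapixels, or roughly 1,250 to 6,250 times the pixel count of an 8- to 40-megapixel consumer camera.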


A photograph of the Seattle skyline taken with AWARE 2, with enlarged image details shown. The new camera can capture up to 50 gigapixels of data; today’s consumer cameras achieve only 8- to 40-megapixel resolution. Images courtesy of Duke University Imaging and Spectroscopy Program.


“We built 300 microcameras in our first run and decided to build two AWARE 2 systems rather than one, which left us about 100 microcameras for each system and about 100 for testing and future development,” David J. Brady told Photonics Spectra. Brady is the Michael J. Fitzpatrick Professor of Engineering at Duke’s Pratt School of Engineering. “We will have the capacity to build one to ten gigapixel cameras per month starting this fall.”

Each of the 98 tiny cameras contains a 14-megapixel sensor, and together they yield nearly 100 separate but accurate images that a computer processor stitches into a single, highly detailed image. Each camera captures information from a specific area of the field of view, often recording details that photographers cannot see themselves but can find when the image is viewed later, Brady said.
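The article does not detail the stitching software itself, but the underlying idea of blending many overlapping sensor tiles into one large mosaic can be sketched in a few lines of NumPy. The tile size, overlap and 7 × 14 grid below are illustrative assumptions chosen only so that 98 tiles are composited; they do not reflect the actual AWARE 2 geometry or processing pipeline.

```python
import numpy as np

# Illustrative mosaic assembly: each "microcamera" contributes one tile that
# overlaps its neighbors; overlapping pixels are averaged into a single canvas.
TILE = 100       # pixels per tile side (assumed)
OVERLAP = 10     # pixels shared with each neighbor (assumed)
GRID = (7, 14)   # 7 x 14 = 98 tiles, matching the number of microcameras

step = TILE - OVERLAP
height = (GRID[0] - 1) * step + TILE
width = (GRID[1] - 1) * step + TILE

canvas = np.zeros((height, width))
coverage = np.zeros((height, width))   # how many tiles touched each pixel

rng = np.random.default_rng(0)
for row in range(GRID[0]):
    for col in range(GRID[1]):
        tile = rng.random((TILE, TILE))        # stand-in for one sensor readout
        y, x = row * step, col * step
        canvas[y:y + TILE, x:x + TILE] += tile
        coverage[y:y + TILE, x:x + TILE] += 1

mosaic = canvas / coverage                     # average wherever tiles overlap
print(mosaic.shape)                            # one seamless array of pixels
```

In a real system, each tile would first have to be registered and corrected for the view it receives through the shared objective before such a simple blend could be applied.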


The AWARE 2 camera synchronizes 98 microcameras into a single device that stitches images together for higher resolution.


Traditionally, better optics have been achieved by adding more glass elements to a device, which increases complexity, said Michael Gehm, assistant professor of electrical and computer engineering at the University of Arizona in Tucson and developer of the software that combines the input from the microcameras.

“This isn’t a problem just for imaging experts,” Gehm said. “Supercomputers face the same problem, with their ever-more-complicated processors, but at some point, the complexity just saturates and becomes cost-prohibitive.”

Instead of relying on increasingly complex optics, Gehm said, their approach uses a parallel array of electronic elements.

“A shared objective lens gathers light and routes it to the microcameras that surround it, just like a network computer hands out pieces to the individual workstations,” he said. “Each gets a different view and works on their little piece of the problem. We arrange for some overlap, so we don’t miss anything.”
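Gehm’s workstation analogy maps onto a simple tiling of the field of view. The Python sketch below hands out overlapping angular sectors of a 120° horizontal field (the figure quoted earlier) to 98 microcameras; the one-dimensional layout and the 10 percent overlap are simplifying assumptions for illustration, not the camera’s actual optical design.

```python
# Divide a 120-degree horizontal field into overlapping sectors, one per
# microcamera -- a 1-D simplification of "each gets a different view, with
# some overlap." The overlap fraction is an assumption for the sketch.
N_CAMERAS = 98
FIELD_DEG = 120.0
OVERLAP_FRAC = 0.1   # each sector shares ~10% of its width with a neighbor

# Sector width chosen so that N overlapping sectors exactly span the field.
sector = FIELD_DEG / (N_CAMERAS - (N_CAMERAS - 1) * OVERLAP_FRAC)
step = sector * (1 - OVERLAP_FRAC)

sectors = [(i * step, i * step + sector) for i in range(N_CAMERAS)]

print(f"each microcamera covers about {sector:.2f} degrees")
print(f"first sector: {sectors[0][0]:.2f} to {sectors[0][1]:.2f} degrees")
print(f"last sector:  {sectors[-1][0]:.2f} to {sectors[-1][1]:.2f} degrees")
```

Each sector ends up spanning roughly 1.4°, with the final sector closing out the full 120° field.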

For now, Brady said, his team will focus on the electronics rather than on increasing the complexity of the optics, because the new lens design approach has proved successful.

The prototype camera, which measures 2.5 × 2.5 ft and is 20 in. deep, needs a lot of space to house and cool its electronic control boards; only about 3 percent of the device consists of optical elements. Because of this, the camera is a long way from commercial availability.

Brady estimates that it will be five years before a more efficient, handheld consumer version of the technology is available for purchase.

“The optics is already small enough for handheld devices,” he said. “As electronics shrink, we anticipate building 100-megapixel to 500-megapixel handheld devices and one- to ten-gigapixel tripod-mounted systems. This technology will get into consumer devices, and consumers may hire photographers to record wall-size wedding photographs.”

Brady foresees many other consumer applications as well.

“I expect the first near-term application will be online broadcast of scenic sites, wildlife preserves and significant events,” he said. “These cameras enable interactive websites that become essentially 100- to 1000-channel broadcast centers. People will log in and track their favorite bird or animal, search for their favorite player, etcetera. These cameras will also be used for interactive telepresence – enabling, for example, overpowering gaming experiences (imagine real-time IMAX over the Grand Canyon).”

Although the research is supported by DARPA, Brady said the technology is more for the consumer than for defense purposes. “Ironically, in an age of constrained government resources, defense applications may develop more slowly than entertainment applications. Consumers will first experience the camera as a service, but, eventually, high-pixel-count cameras will be affordable to serious amateur photographers.”

Brady’s team is working to build prototypes of the AWARE 10, a series of 10-gigapixel cameras. The researchers plan to focus on strategies to reduce power requirements and increase frame rates once these systems come online, he said.

Their next generation of cameras will use color sensors, he added.

The research was published online in Nature (doi: 10.1038/nature11150).

Published: August 2012