
A better eye in the sky

For military imaging, a space-borne perch offers an overwhelming advantage, said John Silny of Raytheon Co. “You have the highest vantage point, the ultimate high ground, to view from.”

Imaging from orbit is changing, as can be seen in the Advanced Responsive Tactically Effective Military Imaging Spectrometer, or ARTEMIS, which was researched and developed by Raytheon’s El Segundo-based Space and Airborne Systems Div. Silny is technical director of the project, which is currently flying aboard a satellite roughly 200 miles above the Earth and is moving into an operational phase.


An engineer adjusts the Advanced Responsive Tactically Effective Military Imaging Spectrometer, or ARTEMIS, hyperspectral imaging sensor prior to launch aboard the US Air Force’s TacSat-3 satellite. Courtesy of Raytheon Co.


Unlike other space-borne military sensors, ARTEMIS is a hyperspectral imager, capturing 400 bands from the visible through the short-wave infrared. The result is a data cube: ground location forms the X and Y axes, while spectral response forms the third dimension.

This approach trades spatial resolution for spectral detail. Objects are identified by their spectral fingerprint: the captured data, corrected for atmospheric effects, is compared with reference spectra gathered in a laboratory.
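That matching step can be illustrated with a spectral-angle comparison, a standard technique in hyperspectral analysis. This is a minimal sketch, not the ARTEMIS algorithm; the library signatures and the threshold below are hypothetical values chosen for illustration.

```python
import numpy as np

def spectral_angle(measured, reference):
    """Angle (radians) between a measured pixel spectrum and a reference.

    Smaller angles mean a closer spectral match. The metric is insensitive
    to overall brightness, which helps after imperfect atmospheric correction.
    """
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    cos_theta = np.dot(measured, reference) / (
        np.linalg.norm(measured) * np.linalg.norm(reference)
    )
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

def classify_pixel(spectrum, library, threshold=0.1):
    """Return the best-matching material name, or None if nothing is close."""
    best = min(library, key=lambda name: spectral_angle(spectrum, library[name]))
    if spectral_angle(spectrum, library[best]) <= threshold:
        return best
    return None

# Hypothetical three-band signatures; a real library spans hundreds of bands.
library = {"grass": [0.1, 0.5, 0.3], "turf": [0.4, 0.4, 0.4]}
print(classify_pixel([0.2, 1.0, 0.6], library))  # a brighter copy of "grass"
```

Because only the shape of the spectrum matters, a field of artificial turf stands out from natural grass even when both look identically green in a conventional image.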


Shown below is an artist’s rendering of the TacSat-3 satellite, relaying hyperspectral imaging and other data to ground forces for tactical use. Courtesy of US Air Force.

The technique works, said Lesley Foster, program director for Tier 2 mission solutions at Raytheon. For example, ARTEMIS detected that a field thought to be natural grass was in fact covered by the artificial variety, determining this based upon its spectral signature.

ARTEMIS is intended to supply tactical information, such as the existence of camouflage, disturbed earth or other signs of enemy activity. To do this, it carries an onboard digital signal processor that allows it to deliver data, such as highlighted areas of suspicious activity, within 10 minutes of acquisition.

If and when the hyperspectral imager goes into operation, there may be some changes. For one thing, its field of view may be enlarged for a tenfold increase in coverage area. For another, it may be flown alongside other sensors on a constellation of satellites. Multiple sensors, some operating at different wavelengths, would provide additional imaging opportunities at varying times and from different perspectives. Together, these can allow more information to be extracted.

This, of course, means that the separate data sets will have to be precisely stacked atop one another. That won’t be a problem because the image location on the ground is accurately known, Foster said. “As long as your geolocation accuracy is there, then you’ve got the knowledge you need to fuse the information.”

Building a better sight

Just as important as seeing from the high ground is seeing what’s right in front of you, particularly if an adversary can’t. That’s the motivation behind the development of the US Army’s next-generation night-vision goggles and weapon sights. It also drives efforts funded by DARPA.


A digital night-vision goggle, such as this BAE Systems-developed prototype, will fuse data from various thermal and low-light sensors to improve situational awareness. Courtesy of BAE Systems.


An example of a future goggle comes from global defense firm BAE Systems. The company’s prototypes digitally combine video imagery from a low-light-level visible sensor and an uncooled long-wave infrared sensor into a single color display mounted in front of the eye.

Fusing the output of two separate sensors offers advantages, said Scott Tarbox, the Lexington, Mass.-based manager of BAE’s Enhanced Night Vision Goggle program. Thermal imaging improves the ability to detect objects hotter than the background, such as people, and makes it possible to see through smoke and other obscurants. Low-light technology enables soldiers to see clearly at night, complementing the lower-resolution thermal image. Using a digital approach allows the signal to be processed for optimal image quality and eliminates the need to combine image streams optically.
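In its simplest digital form, that fusion is a weighted blend of co-registered frames. The sketch below assumes aligned, single-channel inputs and an illustrative weighting; actual goggle processing is considerably more sophisticated.

```python
import numpy as np

def fuse_frames(low_light, thermal, alpha=0.6):
    """Blend co-registered low-light and thermal frames into one image.

    Both inputs are 2-D arrays of equal shape. Each channel is normalized
    to [0, 1] so neither sensor dominates purely through its raw dynamic
    range; alpha weights the low-light channel against the thermal one.
    """
    def normalize(img):
        img = np.asarray(img, dtype=float)
        span = img.max() - img.min()
        return (img - img.min()) / span if span > 0 else np.zeros_like(img)

    return alpha * normalize(low_light) + (1.0 - alpha) * normalize(thermal)
```

A hot target that is dim in the low-light channel still contributes through the thermal term, which is the point of combining the two sensors rather than switching between them.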

Future designs could incorporate higher-resolution sensors, but that enhancement must be weighed against size and weight limits. There also could be issues with other constraints, Tarbox said. “When you get larger resolution systems, there is a trade-off with the processing. The more processing you have to do, the more battery life you consume.”


Prototypes of the new goggles will be tested this year. Production, which will come as a result of a government Request for Proposal, could start in 2013.

DARPA funds far-out, potentially very advantageous technology that will not go into production for years, if ever. One such program is the Super-Resolution Vision System, portions of which have reportedly been in field tests. The goal is to develop sighting systems that offer better than diffraction-limited imaging in a field device carried by a soldier.

One way this could be accomplished depends upon atmospheric microlensing. Although images as a whole are blurred by turbulence, that same turbulence causes a varying set of individual pixels in each captured frame to provide a sharp view. With high-speed imaging and enough processing power, those pixels can be picked out and assembled into a clearer image than is otherwise possible.
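The selection step can be sketched as follows, using local gradient magnitude as a crude stand-in for per-pixel sharpness; the actual system's selection criteria have not been published.

```python
import numpy as np

def lucky_composite(frames):
    """Assemble a composite from a stack of short-exposure frames.

    frames: array of shape (n_frames, H, W). For each pixel, keep the value
    from the frame with the highest local contrast at that location,
    approximated here by the magnitude of the image gradient.
    """
    frames = np.asarray(frames, dtype=float)
    # Gradients within each frame (axes 1 and 2 are the image rows/columns).
    gy, gx = np.gradient(frames, axis=(1, 2))
    sharpness = np.hypot(gx, gy)          # per-pixel, per-frame contrast
    best = np.argmax(sharpness, axis=0)   # sharpest frame index at each pixel
    rows, cols = np.indices(frames.shape[1:])
    return frames[best, rows, cols]
```

The gain comes from statistics: with enough frames, every pixel is eventually caught during an instant when the turbulence above it happens to act as a good lens.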

The hope is that this super vision technology will extend target identification distance, a potentially decisive advantage. Such a capability could reduce friendly-fire incidents and collateral damage.

The eye of the beholder

No matter how advanced the sensor technology, what is captured ultimately must go through the eye. Researchers are working on ways to present increasingly complex information without burdening soldiers unduly.

Nasser Peyghambarian, a professor of optics at the University of Arizona in Tucson, published results a few years ago (see “Coming Attractions: Holographic Movies,” Photonics Spectra, April 2008, p. 94) about holographic technology that allows scenes to be presented in three dimensions, without the need for special glasses. At the time, it took several minutes to record a scene.


Rewritable holograms, such as these, could allow three-dimensional rendering of battlefields and other military command-and-control applications. Courtesy of Nasser Peyghambarian group, University of Arizona.


That is too slow for an application such as the depiction of a battlefield or other command-and-control applications, Peyghambarian said. “Video rates may not be needed, but it needs to be pretty fast.”

He reported that his group had been working on the problem and should have new results out soon. Although he declined to give exact figures, he did say that these would show fast update times, perhaps fast enough to be useful.

Finally, there are microdisplays that use tiny chips near the eye to create a virtual image with the effect of a regular screen some distance away. In this way, a soldier can effectively carry a high-resolution screen in something the size of a postage stamp.


Microdisplays, which use a chip near the eye to create the virtual image of a much larger screen farther away, are used in military applications to fuse sensors and make the invisible visible. The technology is found in devices worn by soldiers (top right, bottom left) as well as in consumer products (camera and glasses, lower right). Courtesy of Kopin Corp.


Kopin Corp. of Taunton, Mass., is developing a 2048 x 2048-pixel device for the US military, said to be the world’s highest-resolution microdisplay. Antonio V. “Tony” Bacarella, director of business development for the company’s visual products group, noted that military applications demand higher imaging system performance than commercial ones. They typically require fewer cosmetic defects, higher contrast and greater brightness uniformity.

They also must work in extreme cold. For that reason, Kopin uses three methods in its devices to maintain operation at very low temperatures. Two of the techniques, which were developed for commercial digital cameras, are integrated in the display backplane, allowing near-instant-on operation, despite the cold.

As can be seen, the company’s military and consumer products do share some technology. Another example is found in the high-resolution microdisplay under development. Its increase in pixel count has been accomplished by a decrease in pixel size, and those smaller pixels are showing up in some commercial products.

Bacarella declined to give specifics about the new technology but did offer a comparison to older products that shows where things are headed. In talking about pixel size, he said, “When you go back a couple of years, the standard was fifteen microns. Now you’re looking at pixels that are sub-nine microns.”

Published: April 2010
