
Thermal Imaging Helps Nab Bombing Suspect

By Melinda Rose, Senior Editor

The photonic technology of thermal imaging was literally in the spotlight on the evening of April 19 as law enforcement officials used it to monitor and capture suspected Boston Marathon bomber Dzhokhar Tsarnaev.

On April 15, two bombs exploded in quick succession more than four hours into the marathon. Three people were killed and more than 140 were wounded. On April 19, Tsarnaev’s older brother, Tamerlan, also a suspect, was killed; later that day, a massive manhunt for Dzhokhar shut down the city of Boston.

A Massachusetts State Police helicopter equipped with a forward-looking infrared camera “picked up the heat signature of the individual, even though he was underneath what appeared to be the shrink-wrap, or cover, on the boat itself,” state police officer Timothy Alben told reporters shortly after the arrest. “The helicopter was able to direct the tactical teams over to that area and, ultimately, take him into custody.”

A forward-looking infrared camera (FLIR) on a Massachusetts State Police helicopter helped to find traces of Boston Marathon bombing suspect Dzhokhar Tsarnaev’s heat signature as he lay in a tarp-covered boat. Courtesy of Massachusetts State Police.


Of course, the use of thermal imagers on helicopters is not a new phenomenon, as no doubt plenty of marijuana growers can attest. Infrared imaging also is used routinely to detect heat loss in buildings to improve energy efficiency or by firefighters searching for people in thick smoke.

Optical imaging technologies can identify suspects by their gait as captured on camera and can calculate a suspect’s pulse to gauge emotional state: a rising pulse rate indicates stress, such as the anticipation of a bold action.

Forensic teams in Boston might use optical techniques to check samples for explosives and other chemicals, whether at long distances or close up, said Christopher Carter, program manager at the Johns Hopkins University Applied Physics Laboratory in Laurel, Md. An example is Raman scattering, which provides a “fingerprint” of the molecules to be identified.

The feet of Boston Marathon bombing suspect Dzhokhar Tsarnaev are clearly visible in this infrared imaging shot from a Massachusetts State Police video.


“Raman spectroscopy is being used by both military and civilian agencies to identify bulk chemicals,” Carter said. “You’d use a laser to look at the Raman scattered light, from a few centimeters away, and detect precisely what is there. Other active optical techniques can identify explosives from up to 100 meters away.”
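At its core, the “fingerprint” matching Carter describes is a comparison of a measured spectrum against a library of known references. Below is a minimal sketch of that idea in Python; the substances, peak positions and library entries are invented purely for illustration and do not come from any real instrument or database.

```python
import numpy as np

def identify_raman(measured, library):
    """Match a measured Raman spectrum against a reference library.

    measured: 1-D array of intensities sampled on the same Raman-shift
              grid as every library entry.
    library:  dict mapping substance name -> 1-D reference spectrum.
    Returns the best-matching name and its correlation score.
    """
    best_name, best_score = None, -1.0
    m = (measured - measured.mean()) / measured.std()
    for name, ref in library.items():
        r = (ref - ref.mean()) / ref.std()
        score = float(np.dot(m, r)) / len(m)   # normalized cross-correlation
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

# Toy example: two synthetic "fingerprints" built from Gaussian peaks.
shift = np.linspace(200, 2000, 1000)          # Raman shift, cm^-1
peak = lambda c, w: np.exp(-((shift - c) / w) ** 2)
library = {
    "substance_A": peak(880, 20) + 0.6 * peak(1340, 30),
    "substance_B": peak(1050, 25) + 0.8 * peak(1610, 20),
}
measured = library["substance_B"] + 0.05 * np.random.randn(shift.size)
print(identify_raman(measured, library))      # -> ("substance_B", ~1.0)
```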

Carter will chair sessions on advances in spectroscopic chemical detection at the SPIE Defense, Security and Sensing (DSS) conference next week in Baltimore. His sessions are part of the conference on Chemical, Biological, Radiological, Nuclear and Explosives (CBRNE) Sensing.

But while photonics is improving law enforcement efforts to identify perpetrators of heinous acts, challenges remain in detecting signs of such attacks before they occur.

The biggest challenges in standoff detection are extending the distance range at which effective identification can occur, improving signal detection over atmospheric and environmental noise and interference, and screening multiple in-motion threats.

Developments in optical standoff detection

A number of research groups around the world are working to improve methods to identify explosives, biological agents or other threats from longer and longer distances. These methods include ultraviolet laser-induced fluorescence, Raman- and quantum cascade laser-based detection, and advances in standoff trace chemical sensing. (For more on this topic, see Advances in Standoff Detection Make the World Safer from the April 2013 issue of Photonics Spectra.)

Earlier this month, we reported on a time-of-flight imaging system that can gather high-resolution 3-D information from up to 1 km away, even from objects that typically are very difficult to image.

Physicists at Heriot-Watt University in Edinburgh, Scotland, created a system that sweeps a low-power infrared laser beam rapidly over an object, recording, pixel-by-pixel, the round-trip flight time of the photons in the beam as they bounce off the object and arrive back at the source. Their system resolves depth on the millimeter scale over long distances using a detector that “counts” individual photons.
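The arithmetic behind the range measurement is simple: the distance to each pixel is the round-trip photon flight time multiplied by the speed of light and divided by two. Here is a rough sketch of that per-pixel conversion, assuming you already have a 2-D array of measured round-trip times (for example, the centroid of each pixel’s photon-count histogram):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def depth_map_from_tof(round_trip_times_s):
    """Convert per-pixel round-trip photon flight times (seconds)
    into a depth map in meters: d = c * t / 2."""
    return C * np.asarray(round_trip_times_s) / 2.0

# A target at 1 km corresponds to roughly 6.7 microseconds of round-trip
# delay; millimeter depth resolution corresponds to timing differences of
# only a few picoseconds.
t = np.full((4, 4), 2 * 910.0 / C)        # flat scene 910 m away
t[1:3, 1:3] += 2 * 0.005 / C              # a feature 5 mm farther back
print(depth_map_from_tof(t))
```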

3-D images of two of the study's authors from Heriot-Watt University, taken in daylight from 910 m away. Each standard photograph shows a close-up view of what the scanner sees. The middle-left panels show 3-D images with slightly less depth detail than the right-hand panels; this is because the detector spent more time collecting the returning photons for the images on the right than on the left. Courtesy of Optics Express.


Although other approaches have achieved exceptional depth resolution, the ability of the new system to image objects — such as clothing — that do not easily reflect laser pulses makes it useful for a wider range of field situations, said Dr. Aongus McCarthy, a research fellow at the university.

“Our approach gives a low-power route to the depth imaging of ordinary, small targets at very long range,” McCarthy said. Although it is possible for other depth-ranging techniques to match or outperform some characteristics of these measurements, “this single-photon-counting approach gives a unique trade-off between depth resolution, range, data-acquisition time and laser power levels.”

The primary use of the system will be scanning static man-made targets such as vehicles, the investigators say. With some modifications to the image-processing software, it also could determine a target’s speed and direction.

The research group of professor Manijeh Razeghi at Northwestern University’s McCormick School of Engineering and Applied Science is working on ways to improve standoff detection. In August 2012, we reported that her group had developed a resonator design that controls both wavelength and beam quality, enabling the purest, brightest and most powerful beams ever from a single-mode infrared quantum cascade laser. The work improves the beams’ accuracy, which is critical for extending the standoff detection of gases, explosives and other hazardous materials to even greater distances. (See: Lasers for Standoff Sensing Improved)


Earlier this year, Razeghi’s group announced a new approach that integrates active and passive infrared imaging capabilities into a single chip for applications such as night-vision goggles and search-and-rescue operations. (See: Active and Passive Modes in One IR Camera)

I can see clearly now

As the dramatic nighttime shootout on April 18 with the Tsarnaev brothers demonstrated, it is critical for law enforcement officers to see in the dark as well as they possibly can to minimize threats. And not just at night, but also through smoke, fog, dust storms and other situations where visibility is poor. And to see clearly enough to determine whether someone is holding a bat or a rifle.

In his April 2013 Photonics Spectra feature, With Infrared, Military Owns More Than the Night, contributing editor Hank Hogan talks with Thomas Bowman, director of ground combat systems for the US Army’s Night Vision and Electronic Sensors Directorate, about how the military is trying to meet these goals — and to make systems less expensive, bulky, heavy and power-hungry — through sensor research and development.

Also included in the article are companies that make surveillance systems, such as Flir, HGH Infrared Systems, UTC Aerospace Systems and startup InView Technology, which is using a compressive sensing imaging approach developed at Rice University.

But it’s not always threats you need to see — there can be victims out there, too, who need rescue.

As we reported in February, researchers at Consiglio Nazionale delle Ricerche (CNR) in Rome used a lens-free technique to develop a system that can cope with the flood of radiation from an environment filled with flames and smoke.

Two images of a human subject as seen through flames. When viewed in infrared or white light, the man is almost completely occluded (left). The new system reproduces the image behind the flames using holography, revealing a man wearing a T-shirt and glasses (right). Courtesy of Optics Express.

The result is an IR camera that does not employ a lens, so collected light is distributed over the whole array of camera pixels, avoiding saturation and the blind spots it produces.

“It became clear to us that we had in our hands a technology that could be exploited by emergency responders and firefighters at a fire scene to see through smoke without being blinded by flames, a limitation of existing technology,” said Pietro Ferraro of CNR’s Istituto Nazionale di Ottica. “Perhaps most importantly, we demonstrated for the first time that a holographic recording of a live person can be achieved even while the body is moving.”
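In lensless digital holography of this kind, the camera records an interference pattern rather than a focused image, and the image is recovered numerically by propagating the recorded field back to the object plane. The sketch below illustrates one standard way to do that, the angular spectrum method; it is a generic example, not the CNR group’s actual algorithm, and the wavelength, pixel pitch and distance are placeholder values.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_pitch, z):
    """Numerically propagate a complex optical field by distance z
    using the angular spectrum method (all lengths in meters)."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_pitch)        # spatial frequencies
    fy = np.fft.fftfreq(ny, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Propagation kernel; evanescent components are suppressed.
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Placeholder values: a 10.6-um long-wave IR hologram on a 512 x 512
# sensor with 20-um pixels, back-propagated 0.5 m to refocus the scene.
hologram = np.random.rand(512, 512)               # stand-in for recorded data
image = np.abs(angular_spectrum_propagate(hologram, 10.6e-6, 20e-6, -0.5))
```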

For more information on how infrared technology developed, see “The Technology of Night Vision” from The Photonics Handbook.

Photonics for defense and security

The Boston bombing will no doubt be on the minds of many who gather in Baltimore next week for SPIE DSS.

Stephen DelMarco, senior principal research engineer at BAE Systems, will be presenting developments in fusion techniques still in the research stage that will make storing and transmitting facial images much more efficient. His system uses multichannel image fusion for 3-D color face imagery, offering a big step up from previous matching systems.

“Now you can exploit multiple channels, for example, red, green and blue color channels along with depth information, that give you better image recognition,” DelMarco said. “In the old days, you had a single channel, a gray-scale image to process, which provided less information.”
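As a purely illustrative take on what “exploiting multiple channels” can mean, the following sketch stacks red, green, blue and depth channels into a single capture and scores two captures with a simple per-channel correlation. It is a conceptual toy, not DelMarco’s fusion algorithm.

```python
import numpy as np

def fused_similarity(capture_a, capture_b):
    """Compare two face captures, each an (H, W, 4) array holding
    R, G, B and depth channels, by averaging per-channel
    normalized correlations."""
    scores = []
    for c in range(capture_a.shape[-1]):
        a = capture_a[..., c].ravel().astype(float)
        b = capture_b[..., c].ravel().astype(float)
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        scores.append(float(np.dot(a, b)) / a.size)
    return float(np.mean(scores))   # 1.0 = identical, near 0 = unrelated

# Toy usage with random 64 x 64 four-channel "captures".
x = np.random.rand(64, 64, 4)
print(fused_similarity(x, x))                          # ~1.0
print(fused_similarity(x, np.random.rand(64, 64, 4)))  # ~0.0
```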

This Wikipedia photo by hahatango shows the aftermath of the bombing in Boston.


One benefit will be image compression for transmission of images captured by color cameras over wireless devices or for storage on disk drives. The compressed images offer the same image quality at a much smaller size, allowing transmission over limited-bandwidth channels and reducing transmission costs.

Emerging sensor needs and challenges that must be overcome to provide security in the 21st century will be the topic of a special session at SPIE DSS on Wednesday, May 1, at 3:30 p.m. The session, National Security Sensor Challenges, will discuss both long-term breakthrough sensing challenges as well as nearer-term cost, size and weight improvements to enable new mission capabilities. It will be moderated by Dr. David Whelan of Boeing Defense, Space and Security.

Another SPIE DSS panel discussion will be held on the challenges of processing big data, something the FBI and Boston Police know firsthand: the massive amount of digital footage of the marathon captured by news organizations, local security cameras, and the smartphones and cameras of onlookers and participants had to be checked for clues.

Moises Sudit of CUBRC Corp. in Buffalo, N.Y., will present a keynote on alternative strategies for dealing with big data. CUBRC executes research, development, testing and systems integration programs in areas such as information exploitation, chemical and biological defense, and public health and safety, according to its website.

Dr. Sos Agaian, Peter T. Flawn Professor of Electrical and Computer Engineering at the University of Texas at San Antonio, noted that investigators in Boston will benefit from the ability to integrate and enhance information from many kinds of mobile devices.

Boston Marathon bomb scene picture taken by investigators shows the remains of an explosive device. Courtesy of the FBI.


“Visually, when you enhance the image, you can see things that at first you didn't see,” Agaian said. “If you have several cellphone images, you can integrate them all into a huge panorama of the scene of an event. And you can make faces and actions more recognizable.”
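Stitching several overlapping bystander photos into one panorama is something the open-source OpenCV library can already do. A minimal sketch, assuming the (hypothetical) image files exist and overlap enough for the stitcher to find matching features:

```python
import cv2

# Hypothetical filenames; any set of overlapping photos of the scene works.
files = ["phone_1.jpg", "phone_2.jpg", "phone_3.jpg"]
images = [cv2.imread(f) for f in files]

stitcher = cv2.Stitcher_create()          # feature matching + blending
status, panorama = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("scene_panorama.jpg", panorama)
else:
    print("Stitching failed, status code:", status)
```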

Agaian’s lab can calculate pulse rate from a facial recording made with a camera and compare the calculations over time to detect changes that actions alone do not reveal. When the heart starts to beat faster, he said, “it suggests something is going on that’s having an emotional effect.”
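A simplified version of that pulse-from-video idea is to average the green channel over the face region in each frame and then look for the dominant frequency in the normal heart-rate band. The sketch below assumes the frames are already cropped to the face; it is a generic remote-photoplethysmography illustration, not Agaian’s method.

```python
import numpy as np

def estimate_pulse_bpm(face_frames, fps):
    """Estimate heart rate from a sequence of face crops.

    face_frames: iterable of (H, W, 3) RGB frames cropped to the face.
    fps:         video frame rate in frames per second.
    """
    # Mean green-channel brightness per frame; blood-volume changes
    # modulate this signal slightly.
    signal = np.array([frame[..., 1].mean() for frame in face_frames])
    signal = signal - signal.mean()

    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)

    # Keep only plausible heart rates (about 40 to 180 beats per minute).
    band = (freqs >= 0.7) & (freqs <= 3.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0   # beats per minute

# Toy usage: synthetic 30-fps "video" with a 1.2 Hz (72 bpm) flicker.
t = np.arange(0, 10, 1 / 30)
frames = [np.full((8, 8, 3), 100.0) + 0.5 * np.sin(2 * np.pi * 1.2 * ti)
          for ti in t]
print(round(estimate_pulse_bpm(frames, fps=30)))   # ~72
```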

Agaian will chair the Mobile Multimedia/Image Processing, Security and Applications conference at DSS.

The search for better ways to detect explosive devices in advance of an event is ongoing, noted Dr. Augustus Way Fountain III, senior research scientist for chemistry in the Research and Technology Directorate, US Army Edgewood Chemical Biological Center, and chair of the CBRNE conference. He also serves as an at-large US representative to the NATO Sensors & Electronics Technology Panel, advising it on CBRNE detection.

“The Holy Grail of bulk or trace explosives detection remains elusive to both military and homeland defense agencies,” he said. “There are a number of research-and-development efforts across agencies working to detect persons or vehicles containing an explosive device before it is detonated; many of these rely on photonics-based technologies.”

For more information, visit: http://spie.org/defense-security-sensing.xml 

Published: April 2013
Glossary
forward-looking infrared
A night-vision device that uses one or more infrared transducers to scan a scene in the 3- to 5-µm or 8- to 12-µm spectral region, convert the infrared radiation to electronic data and present the resulting image on a televisionlike display. The term originally referred to airborne systems but now is used for any real-time thermal imaging system.
optical
Pertaining to optics and the phenomena of light.
photonics
The technology of generating and harnessing light and other forms of radiant energy whose quantum unit is the photon. The science includes light emission, transmission, deflection, amplification and detection by optical components and instruments, lasers and other light sources, fiber optics, electro-optical instrumentation, related hardware and electronics, and sophisticated systems. The range of applications of photonics extends from energy generation to detection to communications and...
thermal imaging
The process of producing a visible two-dimensional image of a scene that is dependent on differences in thermal or infrared radiation from the scene reaching the aperture of the imaging device.