
In a Disaster, Robots Go Where Humans Cannot

Sensors let robots assess a disaster scene and detect trapped people.

Hank Hogan, Contributing Editor

Disaster City, a 52-acre training facility in College Station, Texas, was the setting for a mock-disaster exercise in June. Train wrecks, collapsed buildings and piles of rubble, seeded with realistic signatures to help rescuers locate and identify the “victims,” had all been planned as part of a drill to school emergency responders in how to use the latest robotics.

The exercise was the fourth sponsored by the Science and Technology Standards Program of the Department of Homeland Security (DHS). Another objective of these staged events — managed by the National Institute of Standards and Technology (NIST) in Gaithersburg, Md., for DHS — is to help NIST refine standards and devise tests to qualify rescue robots.


Two robots investigate a simulated garage collapse. The larger is Mesa Robotics Inc.’s Matilda, and the smaller is Omnitech Robotics LLC’s Toughbot. Such robots could prove useful in rescue work, and training exercises such as this one at Disaster City help validate the technology. Courtesy of the National Institute of Standards and Technology.


But these drills also have allowed robotics developers to test their latest designs, which often depend on sensor technologies. The technologies include cameras that capture and relay images of a scene, laser-based light detection and ranging (lidar) systems that map out object distances, and thermal imagers that identify the heat trace of someone trapped in a pile of rubble.

Idaho National Laboratory fuses lidar data into a virtual representation of the local environment. Courtesy of Idaho National Laboratory.



The sensors can be mounted on unmanned ground-based or airborne robots or on vehicles to determine victim location and to perform infrastructure evaluation. For example, knowing which roads are open and which are blocked after a hurricane is important. In the case of buildings, assessing structural integrity can be vital.

NIST is still working out rescue robot standards. One evaluation tool will assess visual acuity by testing the ability of a remote operator to read something like an eye chart at various distances and illuminations. Elena Messina, a group leader at NIST, said that the technique is very close to being issued as a standard. However, she also noted that the institute must be careful: issuing a standard before the technology stabilizes could prove counterproductive.
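As a rough illustration of the arithmetic behind such a test, the Python sketch below estimates how many camera pixels an eye-chart letter spans at a given distance. The function, camera parameters and example numbers are assumptions made for illustration; they are not part of NIST's procedure.

```python
import math

def pixels_on_letter(letter_height_m, distance_m, hfov_deg, image_width_px):
    """Rough estimate of how many pixels span an eye-chart letter.

    Hypothetical sketch, not NIST's actual test procedure: assumes a
    simple pinhole camera with uniform pixel pitch across the field of view.
    """
    angular_size_deg = math.degrees(2 * math.atan(letter_height_m / (2 * distance_m)))
    deg_per_pixel = hfov_deg / image_width_px
    return angular_size_deg / deg_per_pixel

# A 2 cm letter at 3 m, seen by a 640-pixel-wide camera with a
# 60-degree horizontal field of view, spans about 4 pixels -- likely
# too few for a remote operator to read it reliably.
print(f"{pixels_on_letter(0.02, 3.0, 60.0, 640):.1f} pixels")
```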

Digging deep

Two examples of new technology were brought to the Disaster City drills by Satoshi Tadokoro, president of the International Rescue System Institute of Kawasaki and professor of information sciences at Tohoku University in Aoba-ku, both in Japan. One was an active scope camera, an approximately 30-ft-long mechanical snakelike device tethered to its operator. Developed at the institute and the university, it will, on command, vibrate its way into a small opening and relay back pictures. Tadokoro said that the device can reach farther into a pile of rubble than ordinary videoscopes can.

Shown is Robotic FX Inc.’s Negotiator navigating around a Disaster City simulation of a train derailment, which in real life could pose a hazard from a chemical spill. Courtesy of the National Institute of Standards and Technology.



The other instrument was a lidar system packaged in a cube measuring 4 in. on a side. Although small, it measures distances of up to 27 ft, and the addition of a scanning mechanism enables the construction of a three-dimensional map of a volume. Developed by Hokuyo Automatic Co. Ltd. of Osaka, Japan, the lidar system was compact enough to incorporate into a robot, Tadokoro said.
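A scanning lidar produces such a 3-D map by tagging each range reading with the pan and tilt angles of the scanning mechanism and converting to Cartesian coordinates. The Python sketch below shows that conversion under an assumed geometry; it is not based on Hokuyo's actual interface.

```python
import math

def reading_to_point(range_m, pan_rad, tilt_rad):
    """Convert one scanning-lidar reading into a 3-D point.

    Illustrative only: assumes the scanner pans about a vertical axis
    and tilts about a horizontal one, with the sensor at the origin.
    """
    x = range_m * math.cos(tilt_rad) * math.cos(pan_rad)
    y = range_m * math.cos(tilt_rad) * math.sin(pan_rad)
    z = range_m * math.sin(tilt_rad)
    return (x, y, z)

# Mock sweep: constant 2.5 m range at a 10-degree tilt, panning
# from -45 to +45 degrees in 5-degree steps.
cloud = [reading_to_point(2.5, math.radians(pan), math.radians(10))
         for pan in range(-45, 50, 5)]
print(f"{len(cloud)} points; first point: {cloud[0]}")
```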

3-D mapping

Another sensor package came from Australia, courtesy of the Australian Research Council Centre of Excellence for Autonomous Systems at the University of New South Wales. Graduate student Raymond Sheh said that the group used a time-of-flight range sensor from CSEM of Neuchâtel, Switzerland, for three-dimensional mapping. The device determines the distance to each pixel by measuring the time it takes for light to be reflected from an object, thereby providing coordinates in three dimensions to each point.
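The arithmetic behind time-of-flight ranging is straightforward: distance is the speed of light multiplied by the round-trip time, divided by two. A minimal Python sketch, with an illustrative timing value:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_s):
    """Distance implied by a time-of-flight measurement: the light
    travels out and back, so halve the round-trip path."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

# A 20-nanosecond round trip corresponds to roughly 3 m.
print(f"{tof_distance_m(20e-9):.2f} m")
```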

The New South Wales group has in the past fused this data with that collected by color and thermal cameras to create a realistic image of the scene. The combined image helps with terrain sensing, an important capability for a rescue robot attempting to navigate around a pile of rubble.
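A common way to perform such fusion is to project each 3-D point through a pinhole model of the color or thermal camera and sample the pixel it lands on. The Python sketch below illustrates the projection step with made-up camera intrinsics; the article does not describe the New South Wales group's actual method.

```python
def project_to_pixel(x, y, z, fx, fy, cx, cy):
    """Project a 3-D point (camera frame, z pointing forward) onto a
    pinhole camera's image plane, so the pixel it lands on can be
    sampled for color or temperature. Intrinsics are illustrative."""
    if z <= 0:
        return None  # point is behind the camera
    u = fx * x / z + cx
    v = fy * y / z + cy
    return (u, v)

# A point 2 m ahead and 0.5 m to the right of the camera.
print(project_to_pixel(0.5, 0.0, 2.0, fx=525.0, fy=525.0, cx=320.0, cy=240.0))
```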

The present sensor and setup work, but Sheh noted that they need improvement because rescue robots in the real world often have to operate in harsh environments. Ideally, the sensors would be hardened, capable of dealing with dust, shock and heat. They also should have a wide field of view. Finally, he said, they must be able to operate in sunlight.

Autonomous robots

David J. Bruemmer, technical director for unmanned ground vehicles at Idaho National Laboratory in Idaho Falls, did not have an entry in the latest Disaster City drills. However, he has been working on employing 3-D lidar systems in autonomous robots, the type of robot that does not need supervision to move from one spot to another. He noted that research has shown that autonomy brings benefits; for example, it frees the operator to concentrate on other things — such as spotting a victim and determining the best course of action.

Such autonomy requires sensing and mapping of the environment. When implementing such sensing with 2-D lidar systems, Bruemmer has encountered problems. Lasers that are small and low-powered enough to be suitable lack range, are sensitive to light and dust, and are not robust. Lasers without these drawbacks are too large and require too much power to be deployed in a robot. And no matter the laser power, such systems can have problems with certain objects, such as doorknobs, or with certain conditions, such as bright sunlight.

Bruemmer would like to have a laser rangefinder that goes out 10 m with a laser that is small and low-powered. So far, he has not found a device that meets these requirements. One that does, he predicted, could capture the market.

For now, he has been looking into another technology: stereovision, which requires only two cameras and a light source. It is not as accurate as lidar, and the scene must have features that allow the complete picture to be stitched together from separate images. But, he noted, stereovision technology does have some distinct advantages that would justify its use in rescue robots.

“Those systems are ready, very inexpensive, very light, very low power and very small,” he said.
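For reference, stereovision recovers depth from the standard relation Z = fB/d: focal length in pixels times the baseline between the cameras, divided by the disparity of a feature matched between the two images. A minimal Python sketch with illustrative numbers:

```python
def stereo_depth_m(disparity_px, focal_px, baseline_m):
    """Depth from the standard stereo relation Z = f * B / d, where f
    is the focal length in pixels, B the camera separation, and d the
    pixel disparity of a feature matched between the two images."""
    if disparity_px <= 0:
        raise ValueError("matched feature must shift between the views")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: a 12 cm baseline, a 600-pixel focal length and
# a 10-pixel disparity give a depth of about 7.2 m.
print(f"{stereo_depth_m(10.0, 600.0, 0.12):.1f} m")
```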

Published: September 2007
