

Army Robot Detects, Shares Environmental Changes with Human Teammate in Real Time

The robotic component of a human-robot team designed by the U.S. Army is capable of detecting physical changes in 3D and sharing the information it collects with a human in real time. Augmented reality (AR) delivers that information to the human teammate, who can assess it and promptly determine next steps.

A team of U.S. Army scientists demonstrated the capability in a structured real-world environment by pairing a small, autonomous, lidar-equipped mobile ground robot with a human teammate wearing AR glasses. As the robot patrolled its surroundings, it compared its current and previous lidar readings in real time to detect changes. Any changes or perceived abnormalities were instantly displayed in the eyewear for human interpretation.
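The article does not specify the change-detection algorithm the robot uses; a minimal sketch of one common approach, voxel-occupancy differencing between two registered point clouds, is shown below. It assumes both scans have already been aligned to a common map frame (e.g., by the robot's localization), and all names, the voxel size, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def voxelize(points: np.ndarray, voxel_size: float = 0.2) -> set:
    """Map an (N, 3) lidar point cloud to a set of occupied voxel indices."""
    return set(map(tuple, np.floor(points / voxel_size).astype(int)))

def detect_changes(prev_scan: np.ndarray, curr_scan: np.ndarray,
                   voxel_size: float = 0.2):
    """Compare two registered scans; return voxels that appeared or vanished."""
    prev_occ = voxelize(prev_scan, voxel_size)
    curr_occ = voxelize(curr_scan, voxel_size)
    appeared = curr_occ - prev_occ  # newly occupied space (e.g., a placed object)
    vanished = prev_occ - curr_occ  # newly empty space (e.g., a removed object)
    return appeared, vanished

# Synthetic example: one new return appears between patrols and is flagged.
rng = np.random.default_rng(0)
prev = rng.random((1000, 3)) * 10.0          # stand-in for the previous patrol's scan
curr = np.vstack([prev, [[5.0, 5.0, 0.5]]])  # same scene plus one new return
appeared, vanished = detect_changes(prev, curr)
print(len(appeared), "new voxels;", len(vanished), "vanished voxels")
```

In a system like the one described, the centroid of each changed voxel could then be streamed to the AR headset and rendered as a highlight at the corresponding world position.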

Such abnormalities could include anything from camouflaged enemy soldiers to improvised explosive devices, said Christopher Reardon, a researcher in the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory.

The researchers tested lidar sensors of varying resolutions on the robot to determine the best fit for their application. When the robot shared the measurements and changes detected by the different sensors, the human teammate was able to interpret information from both the lower- and higher-resolution lidars. Depending on the size and magnitude of the changes to be detected, the system could therefore use lighter, smaller, faster, and less expensive sensors.
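The article does not describe how the resolution comparison was carried out; one simple way to study such a tradeoff offline is to emulate a sparser spinning lidar by subsampling the beam rings of a denser scan. The sketch below is a hypothetical illustration; the function name, ring layout, and data are assumptions, not the team's method.

```python
import numpy as np

def subsample_rings(points: np.ndarray, rings: np.ndarray,
                    keep_every: int = 4) -> np.ndarray:
    """Emulate a lower-resolution spinning lidar by keeping every Nth beam ring.

    points: (N, 3) xyz returns; rings: (N,) per-point laser-channel index
    (e.g., 0-63 for a 64-channel unit). keep_every=4 roughly emulates a
    16-channel sensor from a 64-channel one.
    """
    return points[rings % keep_every == 0]

# Synthetic demo: 64 rings of 100 points each, thinned to a 16-ring scan.
rings = np.tile(np.arange(64), 100)
pts = np.random.default_rng(1).random((rings.size, 3)) * 30.0
low_res = subsample_rings(pts, rings)
print(pts.shape[0], "->", low_res.shape[0], "points")
```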

The capability could be incorporated into future mixed-reality interfaces, such as the Army’s Integrated Visual Augmentation System (IVAS) goggles. “Incorporating mixed reality into soldiers’ eye protection is inevitable,” Reardon said. “This research aims to fill gaps by incorporating useful information from robot teammates into the soldier-worn visual augmentation ecosystem while simultaneously making the robots better teammates to the soldier.”


The two robots used in the experiments are identically equipped except for their lidar sensors: a Velodyne VLP-16 (left) and an Ouster OS1 (right). Courtesy of the U.S. Army.

Real-world testing of the human-robot team contrasts with much of the existing academic research on mixed-reality interfaces for human-robot teaming, which relies on external instrumentation (typically in a lab) to manage the calculations necessary to share information between teammates. Many engineering efforts to provide humans with mixed-reality interfaces do not examine teaming with autonomous mobile robots, Reardon said.

The research is part of the lab’s ongoing work that explores ways to provide contextual awareness to autonomous robotic ground platforms in maneuver and mobility scenarios. The work is part of the Artificial Intelligence for Mobility and Maneuver Essential Research Program.

Participating researchers also join international coalition partners in the Technical Cooperation Program’s Contested Urban Environment Strategic Challenge (TTCP CUESC) events to test and evaluate human-robot teaming technologies.

Future studies will explore how to strengthen the teaming, with a specific focus on increasing the human’s ability to interact with the robot’s detected changes. This would give the human added real-time information about the context in which the robot detected a change, such as whether it was caused by people or by the natural environment, or whether it is a false positive, Reardon said.

Such enhancements would also improve the autonomous context understanding and reasoning capabilities of the robotic platform, for example by enabling the robot to learn and predict which types of changes constitute a threat.

The research, titled "Enabling Situational Awareness via Augmented Reality of Autonomous Robot-Based Environmental Change Detection," was a collaboration between the U.S. Army and the University of California, San Diego. The collaborators presented the research at the 12th International Conference on Virtual, Augmented, and Mixed Reality, part of the International Conference on Human-Computer Interaction.
