

Robot-human interactions improve with facial recognition

Compiled by Photonics Spectra staff

Scientists studying facial expressions are investigating ways to create socially aware companion robots and graphical characters that can recognize human facial movements.

Our brains pick up tiny, subtle clues about faces whenever we interact with other people, and scientists from Queen Mary, University of London and University College London are now investigating whether robots and computers can pick up on such facial cues as well.

By breaking facial movement down into elementary facial actions and studying how those actions vary between people, computer scientists can analyze facial motion and build realistic movement into avatars, which could make them a more acceptable means of communicating with people (a simple sketch of the idea follows below). This will become increasingly important as robots become more integrated into daily life; robotic aids, for example, are already being used in hospitals, the scientists say.
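The decomposition described here can be illustrated with a hypothetical sketch: landmark trajectories from a recorded face are reduced to a small set of elementary motion components, and the resulting per-frame coefficients are replayed on a different, "average" face shape. The code below is not the LIREC software; it uses synthetic landmark data and an off-the-shelf PCA purely to show the principle.

```python
# Illustrative sketch only: decompose facial landmark motion into elementary
# components and replay it on a different "avatar" face. Synthetic data stands
# in for real tracked landmarks; this is not the researchers' actual software.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

n_frames, n_landmarks = 200, 68                      # e.g., a short clip, 68 tracked points
neutral_face = rng.normal(size=(n_landmarks, 2))     # stand-in for a neutral face shape

# Simulate facial movement as a mix of two "elementary actions"
# (e.g., a smile-like and a brow-raise-like displacement pattern).
action_shapes = rng.normal(scale=0.1, size=(2, n_landmarks, 2))
t = np.linspace(0, 4 * np.pi, n_frames)
activations = np.stack([np.sin(t), 0.5 * np.cos(2 * t)], axis=1)   # (frames, 2)

frames = neutral_face + np.einsum('fk,kld->fld', activations, action_shapes)

# Step 1: learn elementary facial actions as principal components of the
# per-frame displacement from the neutral face.
displacements = (frames - neutral_face).reshape(n_frames, -1)
pca = PCA(n_components=2)
coeffs = pca.fit_transform(displacements)            # per-frame action intensities

# Step 2: transfer the motion onto a different face (an "average avatar") by
# replaying the same coefficients relative to the avatar's neutral shape.
avatar_neutral = rng.normal(size=(n_landmarks, 2))
avatar_frames = avatar_neutral + pca.inverse_transform(coeffs).reshape(
    n_frames, n_landmarks, 2)

print("explained variance of the two components:", pca.explained_variance_ratio_)
print("animated avatar frames:", avatar_frames.shape)
```

Because the motion is separated from the identity of the face it was recorded on, the same coefficients can drive any target face shape, which is what makes such decompositions useful for studying facial motion in isolation.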


The “Emys” expressive robot head has been created to allow researchers on the LIREC project to explore social interactions between humans and machines. Courtesy of the LIREC consortium.


Drawing on biology, the researchers hope to develop software that will enable robots to interact naturally with humans. They aim to train the robots to understand concepts such as personal space, as well as how to pick up on and react to emotions.

“A picture of a face is just a frozen sample drawn from a highly dynamic sequence of movements,” said Alan Johnston, co-researcher and professor from the University College London division of psychology and language sciences. “Facial motion transfer onto other faces or average avatars provides an extremely important tool for studying dynamic face perception in humans, as it allows experimenters to study facial motion in isolation from the form of the face.”

The technology could also be used to study how people imitate facial expressions and why they are better at recognizing their own facial movements than those of the people they are conversing with.

The scientists presented their work from the European Union-funded "Living with robots and interactive companions" (LIREC) project at the Royal Society's annual Summer Science Exhibition, held July 5-10, 2011.
