
Robot-human interactions improve with facial recognition

Compiled by Photonics Spectra staff

Scientists studying facial expressions are investigating a way to create socially aware companion robots and graphical characters that could recognize human facial movements.

Our brains pick up tiny, subtle clues about faces whenever we interact with other people, and now scientists from Queen Mary, University of London and University College London are investigating the possibilities of robots and computers picking up on such facial cues as well.

By breaking facial movement down into elementary facial actions and understanding how those actions vary between people, computer scientists can analyze facial movement and build realistic motion into avatars, which could make avatars a more acceptable means of communicating with people. This will become increasingly important as robots grow more integrated into daily life; robotic aids, for example, are already used in hospitals, the scientists say.
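The decomposition the researchers describe can be pictured as encoding each frame of facial motion as a vector of elementary-action intensities, which can then be re-applied to an avatar's neutral face. The sketch below is only an illustration of that idea; the action names, data, and `transfer_motion` function are hypothetical and not part of the LIREC project's actual software.

```python
import numpy as np

# Hypothetical elementary facial actions (in the spirit of action-unit
# decompositions); each frame of motion is a vector of intensities.
ACTIONS = ["brow_raise", "eye_close", "lip_corner_pull", "jaw_drop"]

def transfer_motion(source_frames, avatar_neutral):
    """Re-target a sequence of action-intensity vectors onto an avatar
    by offsetting them from the avatar's neutral pose."""
    return [avatar_neutral + frame for frame in source_frames]

# A made-up two-frame "smile" sequence captured from a human performer:
smile = [np.array([0.1, 0.0, 0.2, 0.0]),
         np.array([0.2, 0.0, 0.6, 0.1])]

avatar_neutral = np.zeros(len(ACTIONS))
retargeted = transfer_motion(smile, avatar_neutral)
print(retargeted[-1])  # final frame as applied to the avatar
```

Because the motion lives in action space rather than pixel space, the same sequence could in principle be mapped onto any face, which is what allows motion to be studied in isolation from facial form.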


The “Emys” expressive robot head has been created to allow researchers on the LIREC project to explore social interactions between humans and machines. Courtesy of the LIREC consortium.


Drawing on biology, the researchers hope to develop software that will enable robots to interact naturally with humans. They aim to train robots to understand concepts such as personal space and to pick up on and react to emotions.

“A picture of a face is just a frozen sample drawn from a highly dynamic sequence of movements,” said Alan Johnston, co-researcher and professor from the University College London division of psychology and language sciences. “Facial motion transfer onto other faces or average avatars provides an extremely important tool for studying dynamic face perception in humans, as it allows experimenters to study facial motion in isolation from the form of the face.”

The technology has broad application potential: it could be used to understand how people imitate facial expressions and why they are better at recognizing their own facial movements than those of the people they are conversing with.

The scientists presented their work from the European Union-funded “Living with robots and interactive companions” (LIREC) project at the annual Royal Society’s Summer Science Exhibition, held July 5-10, 2011.

Published: September 2011
