
Smart Necklace Tracks Facial Expressions

ITHACA, N.Y., Aug. 10, 2021 — Cornell University researchers have developed a smart necklace wearable device capable of tracking facial expressions using light. The device, NeckFace, continuously tracks full facial expressions using infrared cameras to capture images of the chin and face from beneath the neck.
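The article does not describe NeckFace's reconstruction pipeline in detail, but the underlying idea — regressing facial geometry from an infrared view of the chin and lower face — can be sketched as a small convolutional model. Everything below (input resolution, landmark count, network architecture) is an assumption for illustration and is not the researchers' actual system.

# Illustrative sketch only: a minimal convolutional regressor that maps a
# single-channel infrared frame of the chin/lower face to 3D facial landmarks.
# The landmark count (68) and all layer sizes are assumptions, not from the paper.
import torch
import torch.nn as nn

class ExpressionRegressor(nn.Module):
    def __init__(self, num_landmarks: int = 68):
        super().__init__()
        self.num_landmarks = num_landmarks
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, num_landmarks * 3)  # (x, y, z) per landmark

    def forward(self, ir_frame: torch.Tensor) -> torch.Tensor:
        # ir_frame: (batch, 1, H, W) infrared image captured from below the neck
        x = self.features(ir_frame).flatten(1)
        return self.head(x).view(-1, self.num_landmarks, 3)

if __name__ == "__main__":
    model = ExpressionRegressor()
    frame = torch.rand(1, 1, 120, 160)  # placeholder IR frame
    print(model(frame).shape)  # torch.Size([1, 68, 3])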

Cheng Zhang, assistant professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science, led the work. NeckFace builds on a previous device developed by Zhang, C-Face, which worked in a similar fashion but was worn as a headset. The new device provides significant improvement, Zhang said, both in performance and privacy, and gives the wearer the option of a less-obtrusive neck-mounted device.
A close-up of the NeckFace wearable sensing technology hardware, which includes an infrared camera, a near-infrared LED, and an IR narrow band-pass filter. Courtesy of SciFi Lab.

Beyond potential emotion tracking, Zhang envisions a number of applications for the technology: virtual conferencing when a front-facing camera is not an option, facial expression detection in virtual reality scenarios, and silent speech recognition.

“The ultimate goal is having the user be able to track their own behaviors, through continuous tracking of facial movements,” said Zhang, principal investigator at the Smart Computer Interface for Future Interactions (SciFi) Lab. “And this hopefully can tell us a lot of information about your physical activity and mental activities.”

On video conferencing, François Guimbretière, professor of information science in the Cornell Bowers College, said, “The user wouldn’t need to be careful to stay in the field of view of a camera. Instead, NeckFace can re-create the perfect headshot as we move around in a classroom, or even walk outside to share a walk with a distant friend.”


Zhang and his collaborators conducted a user study with 13 participants, each of whom was asked to perform eight facial expressions while sitting and eight more while walking. In the sitting scenarios, the participants were also asked to rotate their heads while performing the facial expressions, and to remove and remount the device in one session.

NeckFace was tested in two designs: a neckband draped around the back of the neck with twin cameras just below the collarbone level, and a necklace with a pendant-like infrared (IR) camera device hanging below the neck.

Baseline facial movement data was collected using the TrueDepth 3D camera on an iPhone X and compared to the data collected with NeckFace. Across the sitting, walking, and remounting scenarios, study participants displayed a total of 52 facial shapes.

Using deep learning, the group determined that NeckFace detected facial movement with accuracy close to that of the direct measurements from the phone camera. The neckband proved more effective than the necklace, the researchers found, possibly because the neckband's two cameras could capture more information from both sides of the face than could the center-mounted necklace camera.
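The article does not state which error measure the researchers used when comparing NeckFace against the TrueDepth baseline. One commonly used measure for this kind of comparison, shown here purely as an assumed illustration, is the mean Euclidean distance per landmark averaged over frames.

# Illustrative sketch: score predicted facial geometry against baseline landmarks
# using mean per-landmark Euclidean error. Shapes and units are assumptions.
import numpy as np

def mean_landmark_error(predicted: np.ndarray, baseline: np.ndarray) -> float:
    """predicted, baseline: arrays of shape (frames, landmarks, 3), same units."""
    assert predicted.shape == baseline.shape
    per_landmark_dist = np.linalg.norm(predicted - baseline, axis=-1)  # (frames, landmarks)
    return float(per_landmark_dist.mean())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = rng.normal(size=(100, 68, 3))                      # stand-in TrueDepth landmarks
    pred = truth + rng.normal(scale=0.05, size=truth.shape)    # stand-in NeckFace output
    print(f"mean landmark error: {mean_landmark_error(pred, truth):.3f}")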

Once the device is optimized, Zhang believes it could be particularly useful in the mental health realm for tracking people’s emotions over the course of a day. Although people don’t always wear their emotions on their face, he said, the amount of facial expression change over time could indicate emotional swings.

“Can we actually see how your emotion varies throughout a day?” he said. “With this technology we could have a database on how you’re doing physically and mentally throughout the day, and that means you could track your own behaviors. And also, a doctor could use the information to support a decision.”

The research was published in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (www.doi.org/10.1145/3463511).

Published: August 2021
Glossary
infrared
Infrared (IR) refers to the region of the electromagnetic spectrum with wavelengths longer than those of visible light, but shorter than those of microwaves. The infrared spectrum spans wavelengths roughly between 700 nanometers (nm) and 1 millimeter (mm). It is divided into three main subcategories: Near-infrared (NIR): Wavelengths from approximately 700 nm to 1.4 micrometers (µm). Near-infrared light is often used in telecommunications, as well as in various imaging and sensing...
near-infrared
The shortest wavelengths of the infrared region, nominally 0.75 to 3 µm.
tracking
1. The process of following an object's movement; accomplished by focusing a radar beam on the reticle of an optical system on the object and plotting its bearing and distance at specific intervals. 2. In display technology, use of a light pen to move an object across a display screen.
Research & Technology, Imaging, Sensors & Detectors, cameras, infrared, near-infrared, expression monitoring, facial expression, tracking, Cornell University, Cornell Ann S. Bowers College of Computing and Information Science, SciFi Lab, Cheng Zhang
