
Smart Necklace Tracks Facial Expressions

ITHACA, N.Y., Aug. 10, 2021 — Cornell University researchers have developed a necklace-style wearable device capable of tracking facial expressions using light. The device, NeckFace, continuously tracks full facial expressions using infrared cameras to capture images of the chin and face from beneath the neck.

Cheng Zhang, assistant professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science, led the work. NeckFace builds on a previous device developed by Zhang, C-Face, which worked in a similar fashion but was worn as a headset. The new device provides significant improvement, Zhang said, both in performance and privacy, and gives the wearer the option of a less-obtrusive neck-mounted device.
A close-up of the NeckFace wearable sensing technology hardware, which includes an infrared camera, a near-infrared LED, and an IR narrow band-pass filter. Courtesy of SciFi Lab.

Beyond potential emotion tracking, Zhang envisions a number of applications for the technology: virtual conferencing when a front-facing camera is not an option, facial expression detection in virtual reality scenarios, and silent speech recognition.

“The ultimate goal is having the user be able to track their own behaviors, through continuous tracking of facial movements,” said Zhang, principal investigator at the Smart Computer Interface for Future Interactions (SciFi) Lab. “And this hopefully can tell us a lot of information about your physical activity and mental activities.”

On video conferencing, François Guimbretière, professor of information science in the Cornell Bowers College, said, “The user wouldn’t need to be careful to stay in the field of view of a camera. Instead, NeckFace can re-create the perfect headshot as we move around in a classroom, or even walk outside to share a walk with a distant friend.”

Zhang and his collaborators conducted a user study with 13 participants, each of whom was asked to perform eight facial expressions while sitting and eight more while walking. In the sitting scenarios, the participants were also asked to rotate the head while performing the facial expressions, and remove and remount the device in one session.

NeckFace was tested in two designs: a neckband draped around the back of the neck with twin cameras just below the collarbone level, and a necklace with a pendant-like infrared (IR) camera device hanging below the neck.

Baseline facial movement data was collected using the TrueDepth 3D camera on an iPhone X and compared to the data collected with NeckFace. Between the sitting, walking, and remounting expressions, study participants displayed a total of 52 facial shapes.

Using deep learning, the group determined that NeckFace detected facial movement with close to the same accuracy as the direct measurements using the phone camera. The neckband proved more effective than the necklace, the researchers found, possibly because the neckband's two cameras could capture more information from both sides of the face than the center-mounted necklace camera.

Once the device is optimized, Zhang believes it could be particularly useful in the mental health realm for tracking people’s emotions over the course of a day. Although people don’t always wear their emotions on their face, he said, the amount of facial expression change over time could indicate emotional swings.

“Can we actually see how your emotion varies throughout a day?” he said. “With this technology we could have a database on how you’re doing physically and mentally throughout the day, and that means you could track your own behaviors. And also, a doctor could use the information to support a decision.”

The research was published in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.

©2023 Photonics Media, 100 West St., Pittsfield, MA, 01201 USA, [email protected]

Photonics Media, Laurin Publishing
x We deliver – right to your inbox. Subscribe FREE to our newsletters.
We use cookies to improve user experience and analyze our website traffic as stated in our Privacy Policy. By using this website, you agree to the use of cookies unless you have disabled them.