

Smart Necklace Tracks Facial Expressions

Cornell University researchers have developed a wearable smart necklace that tracks facial expressions using light. The device, NeckFace, continuously tracks the full face, using infrared cameras to capture images of the chin and face from beneath the neck.

Cheng Zhang, assistant professor of information science in the Cornell Ann S. Bowers College of Computing and Information Science, led the work. NeckFace builds on a previous device developed by Zhang, C-Face, which worked in a similar fashion but was worn as a headset. The new device offers significant improvements in both performance and privacy, Zhang said, and gives the wearer the option of a less obtrusive neck-mounted device.

A close-up of the NeckFace wearable sensing technology hardware, which includes an infrared camera, a near-infrared LED, and an IR narrow band-pass filter. Courtesy of SciFi Lab.

Beyond potential emotion tracking, Zhang envisions a number of applications for the technology: virtual conferencing when a front-facing camera is not an option, facial expression detection in virtual reality scenarios, and silent speech recognition.

“The ultimate goal is having the user be able to track their own behaviors, through continuous tracking of facial movements,” said Zhang, principal investigator at the Smart Computer Interface for Future Interactions (SciFi) Lab. “And this hopefully can tell us a lot of information about your physical activity and mental activities.”

On video conferencing, François Guimbretière, professor of information science in the Cornell Bowers College, said, “The user wouldn’t need to be careful to stay in the field of view of a camera. Instead, NeckFace can re-create the perfect headshot as we move around in a classroom, or even walk outside to share a walk with a distant friend.”

Zhang and his collaborators conducted a user study with 13 participants, each of whom was asked to perform eight facial expressions while sitting and eight more while walking. In the sitting scenarios, the participants were also asked to rotate their heads while performing the facial expressions and, in one session, to remove and remount the device.

NeckFace was tested in two designs: a neckband draped around the back of the neck with twin cameras just below the collarbone level, and a necklace with a pendant-like infrared (IR) camera device hanging below the neck.

Baseline facial movement data was collected using the TrueDepth 3D camera on an iPhone X and compared to the data collected with NeckFace. Across the sitting, walking, and remounting sessions, study participants displayed a total of 52 facial shapes.
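
Scoring how closely NeckFace matched the TrueDepth baseline comes down to comparing the facial geometry each system reports. The sketch below is an illustration only, assuming both systems output 3D facial landmark arrays of the same shape and that accuracy is scored as a mean Euclidean distance; the array shapes, landmark count, and metric are assumptions for the example, not details reported by the researchers.

```python
# Illustrative sketch: scoring NeckFace-style predictions against
# TrueDepth-style ground truth. Shapes and the error metric are
# assumptions, not details from the published study.
import numpy as np

def mean_landmark_error(predicted: np.ndarray, ground_truth: np.ndarray) -> float:
    """Mean Euclidean distance between corresponding 3D landmarks.

    Both arrays are assumed to have shape (num_frames, num_landmarks, 3).
    """
    distances = np.linalg.norm(predicted - ground_truth, axis=-1)
    return float(distances.mean())

# Synthetic example: 100 frames of 60 landmarks (arbitrary counts),
# with small simulated prediction error added to the ground truth.
rng = np.random.default_rng(0)
truth = rng.normal(size=(100, 60, 3))
pred = truth + rng.normal(scale=0.05, size=truth.shape)
print(f"mean landmark error: {mean_landmark_error(pred, truth):.4f}")
```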

Using deep learning, the group determined that NeckFace detected facial movement with close to the same accuracy as the direct measurements from the phone camera. The neckband proved more effective than the necklace, the researchers found, possibly because the neckband's two cameras could capture more information from both sides of the face than could the center-mounted necklace camera.
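
The article does not describe the model itself, so the following is only a minimal sketch, assuming the pipeline regresses 3D facial landmark coordinates from single-channel near-infrared frames; the architecture, input resolution, and landmark count are all hypothetical.

```python
# Hypothetical convolutional regressor from a near-IR chin image to
# 3D facial landmarks. Architecture and sizes are illustrative only,
# not the authors' model.
import torch
import torch.nn as nn

class IRToLandmarks(nn.Module):
    def __init__(self, num_landmarks: int = 60):  # landmark count is arbitrary
        super().__init__()
        self.num_landmarks = num_landmarks
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),  # IR frames are single-channel
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pool to a fixed-size feature vector
        )
        self.head = nn.Linear(32, num_landmarks * 3)  # (x, y, z) per landmark

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.features(x).flatten(1)
        return self.head(z).view(-1, self.num_landmarks, 3)

# Smoke test on a batch of two 64x64 frames.
model = IRToLandmarks()
print(model(torch.randn(2, 1, 64, 64)).shape)  # torch.Size([2, 60, 3])
```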

Once the device is optimized, Zhang believes it could be particularly useful in the mental health realm for tracking people’s emotions over the course of a day. Although people don’t always wear their emotions on their face, he said, the amount of facial expression change over time could indicate emotional swings.

“Can we actually see how your emotion varies throughout a day?” he said. “With this technology we could have a database on how you’re doing physically and mentally throughout the day, and that means you could track your own behaviors. And also, a doctor could use the information to support a decision.”

The research was published in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (www.doi.org/10.1145/3463511).
