

Camera-Based Monitoring of Vital Signs Improved

Refined signal processing allows video cameras to monitor vital signs regardless of ambient lighting and the subject’s skin color.

A new algorithm developed by researchers at Rice University detects subtle variations in skin tone caused by blood circulation to determine pulse and breathing rates.

The idea of using a camera to track vital signs is based on photoplethysmography (PPG), a way to measure physiological processes under the skin by monitoring subtle color changes at the skin’s surface. Camera-based PPG has been studied previously, but its applications were limited because it did not work reliably unless subjects were fair-skinned and sitting perfectly still in a well-lit room.
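The core of any camera-based PPG system is recovering a periodic pulse signal from tiny frame-to-frame intensity changes over a skin region. A minimal sketch of that step, assuming the mean green-channel intensity of a facial region has already been extracted per frame (function and variable names here are illustrative, not from the paper):

```python
import numpy as np

def estimate_pulse_bpm(green_means, fps):
    """Estimate pulse rate (beats/min) from a time series of mean
    green-channel intensities over a skin region. Illustrative sketch:
    find the dominant spectral peak in the physiological band."""
    x = np.asarray(green_means, dtype=float)
    x = x - x.mean()                                 # remove DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)           # ~42-240 bpm
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Synthetic check: a 1.2 Hz (72 bpm) pulse buried in noise, 30 fps video
fps = 30.0
t = np.arange(0, 30, 1.0 / fps)
rng = np.random.default_rng(0)
signal = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * rng.standard_normal(t.size)
print(estimate_pulse_bpm(signal, fps))
```

In a dark-skinned or poorly lit subject the spectral peak is weaker relative to noise, which is the reliability problem the Rice work addresses.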



The PPG signal is extracted from four regions marked on the face. A weighted average of the four readings compares well with a reading from a pulse oximeter. Courtesy of The Optical Society.


The Rice algorithm, called DistancePPG, overcomes these problems by measuring the eyes, nose and mouth separately. A weighted average, based on blood perfusion and incident light intensity, is applied to the data to estimate vital signs.
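The weighted-averaging step can be sketched as follows. This is a simplified illustration, not the published DistancePPG implementation: the region layout, weights, and signal model here are hypothetical, standing in for weights the algorithm derives from each region's blood perfusion and incident light intensity.

```python
import numpy as np

def combine_regions(region_signals, region_weights):
    """Combine per-region PPG traces into one estimate by weighted
    averaging, so that strong regions (good perfusion, good lighting)
    dominate and noisy regions contribute little."""
    w = np.asarray(region_weights, dtype=float)
    w = w / w.sum()                        # normalize weights to sum to 1
    sigs = np.asarray(region_signals)      # shape: (n_regions, n_samples)
    return w @ sigs                        # weighted average, per sample

# Hypothetical example: four facial regions sharing one underlying pulse,
# each corrupted by a different noise level; weight the cleanest most.
t = np.linspace(0, 10, 300)
clean = np.sin(2 * np.pi * 1.2 * t)
rng = np.random.default_rng(1)
regions = [clean + n * rng.standard_normal(t.size) for n in (0.1, 0.3, 0.5, 0.8)]
weights = [0.5, 0.25, 0.15, 0.1]           # e.g. inverse-noise weighting
combined = combine_regions(regions, weights)
```

The design point is that a weighted average suppresses noise roughly in proportion to the squared weights, so the combined trace is cleaner than even a simple mean when weights track per-region signal quality.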

“Interestingly, this technique has been known in other domains of computer vision, but has not been properly applied to the problem at hand,” said graduate student Mayank Kumar. “Once we understood the motion challenge, the tracking approach became obvious.”

To test their new algorithm, the researchers monitored adults engaging in common activities. Results were compared to readings from pulse oximeters attached to subjects’ earlobes.

The algorithm improved the PPG signal in situations with low levels of motion, such as when subjects were reading or watching a video. However, it remained relatively inaccurate when subjects were talking or smiling. These larger movements changed the facial light reflectance more dramatically and made extracting a reliable signal difficult, the researchers said.

Noncontact methods for monitoring vital signs are especially desirable in neonatal intensive care units, where repeatedly attaching and removing monitors can injure premature babies and leave them susceptible to infection.

If the motion problems can be overcome, the researchers said, the technique could even find its way into health-tracker applications for smartphones and computers.

The work was published in Biomedical Optics Express (doi: 10.1364/BOE.6.001565).

For more information, visit www.rice.edu.



©2024 Photonics Media