

Machine Learning Reduces Drift in Fiber Sensors

Combining machine learning with an established mathematical model, scientists at the National Institute of Standards and Technology (NIST) developed an algorithm that predicts drift in existing fiber Bragg grating (FBG) temperature sensors. The algorithm could be used to reduce the effect of drift without the need to purchase or develop new sensor technology.

“It’s an alternative approach where you can have your cake (keep the existing technology) and eat it too (reduce the contribution of long-term drift),” researcher Zeeshan Ahmed said. “Fiber Bragg-grating sensors are cheap. Rather than spending five years to develop better materials, why not just use this algorithm, or a similar one in this family of algorithms?”

FBG temperature sensors, embedded into structures, are used routinely in civil infrastructure and in the oil and gas industries. However, they are not accurate enough for certain other applications, such as monitoring temperatures in medical-grade refrigerators.

One significant hit to the sensors’ accuracy comes from long-term drift. Bragg sensors work by exploiting the interaction between light and a periodic structure etched into an optical fiber. When the Bragg sensor is exposed to high temperatures over time, the refractive index of the fiber changes, and this change is thought to be the cause of drift errors.
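The underlying physics is the standard Bragg condition (the symbols below are the conventional ones, not taken from the paper): the wavelength reflected by the grating depends on both the fiber's effective refractive index and the grating period,

```latex
\lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda
```

where \(\lambda_B\) is the reflected (Bragg) wavelength, \(n_{\mathrm{eff}}\) the effective refractive index, and \(\Lambda\) the grating period. Temperature is inferred from shifts in \(\lambda_B\), so a slow change in \(n_{\mathrm{eff}}\) shifts the reflected wavelength even at constant temperature, which appears as calibration drift.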

Recalibrating the sensor every few months corrects the problem, but this can be expensive and time-consuming, especially if the sensor is buried in concrete or otherwise embedded permanently in a structure. Additionally, long-term drift in Bragg sensors causes uncertainties in temperature measurement ranging from 200 to 300 mK (roughly a third to a half of a degree Fahrenheit).

“To be competitive with existing technology, you want to get that down to about half that value, and if possible, to a few tens of millikelvin,” Ahmed said.

Ahmed and the team studied the temporal evolution of the temperature response of 14 sensors that were repeatedly cycled between 233 and 393 K. After evaluating various calibration models, they determined that a dynamic regression model could effectively reduce measurement uncertainty due to drift by up to about 70%. This could be sufficient for studying some processes, such as industrial fermentation, that rely on temperature control.

Further, they found that the total amount of light reflected by the grating and the intensity of light at each wavelength were both helpful in predicting future drift. The history of the sensor — for example, how fast it was heated or cooled, or how high its temperature was in the hours leading up to a change in temperature — was also useful in forecasting drift.

An early prototype of a chip-based photonic thermometer. The sensor is built into the chip, while light enters and exits the sensor via optical fibers. A recently developed approach improves the accuracy of fiber Bragg grating sensors without the need for recalibration. Courtesy of Jennifer Lauren Lee/NIST.

The researchers applied autoregressive integrated moving average (ARIMA) models, a class of mathematical models created in the 1970s, to reduce measurement uncertainties. “I’m not using the most advanced technique,” Ahmed said. “Even the older methods can give you a lot of information.”
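The idea behind the ARIMA family can be shown with a minimal, self-contained sketch. This is a toy ARIMA(1,1,0)-style model on synthetic data, not the authors' implementation: difference the drift series (the "I" step), fit an autoregression on the differences (the "AR" step), and roll the fitted model forward to predict future drift.

```python
# Toy ARIMA(1,1,0)-style drift predictor on synthetic data.
# Hypothetical example; the paper's actual models are more elaborate.

def fit_ar1_on_differences(series):
    """Fit d_t = phi * d_{t-1} by least squares, where d = diff(series)."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    num = sum(diffs[t] * diffs[t - 1] for t in range(1, len(diffs)))
    den = sum(d * d for d in diffs[:-1])
    return num / den if den else 0.0

def forecast(series, phi, steps):
    """Integrate the AR(1) difference model forward to predict drift."""
    level = series[-1]
    last_diff = series[-1] - series[-2]
    out = []
    for _ in range(steps):
        last_diff = phi * last_diff   # next predicted difference
        level += last_diff            # undo the differencing
        out.append(level)
    return out

# Synthetic drift: a decaying upward creep in the calibration offset (mK).
drift = [0.0]
step = 10.0
for _ in range(30):
    step *= 0.9
    drift.append(drift[-1] + step)

phi = fit_ar1_on_differences(drift)
predicted = forecast(drift, phi, 5)
# The predicted drift could then be subtracted from raw sensor readings.
```

In practice one would use a full ARIMA implementation (e.g., a statistics library's time-series tools) rather than this hand-rolled fit, but the correction step is the same: forecast the drift and subtract it from the measurement.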

The downside of this model is that it only works for short-term drift occurring over a few weeks rather than over months or years.

NIST researchers are developing chip-based photonic thermometers that, compared to traditional thermometry techniques, promise to be smaller and more durable, resistant to electromagnetic interference, and potentially self-calibrating. These sensors are still in the testing phase.

Ahmed’s original intent was simply to help scientists understand the drift problem better. “I thought if I can understand the drift process and compensate for it mathematically, then I can reduce these uncertainties to an acceptable level,” he said. His proof-of-concept work shows how machine learning can enable researchers to make self-calibrating or self-correcting sensors using existing technology.

A second paper, still in review, describes Ahmed’s work to make a true physics-based model that lays out the fundamental relationship between wavelength and temperature in both fiber- and chip-based photonic thermometers.

“That would be even better,” he said. “If we have a physics-based model, then we can describe how the physics is changing over time and how that’s causing the changes in the calibration of these devices. And then we could really understand and quantify what’s happening to your sensor.”

The research was published in Sensors and Actuators A: Physical (www.doi.org/10.1016/j.sna.2022.113872).


©2024 Photonics Media