Machine Vision, Artificial Nose Combine to Monitor Cooked Chicken

Skoltech (Russia) scientists are working to combine machine vision with an artificial nose to ensure the proper level of doneness for cooked chicken. The technology aims to help restaurants monitor and automate cooking processes.

In pursuit of the perfect chicken, the researchers employed an industrial camera and an array of sensors (called an e-nose) designed to detect the presence of certain components of an odor. These devices monitor the chicken as it cooks — looking and smelling to determine when it is fully cooked.

The work evolved from a student project at the lab by Ainul Yaqin, the co-author of the research paper. Yaqin traveled to Novosibirsk to test the ability of chemical sensors developed by the lab to monitor the effectiveness of industrial filters in restaurant ventilation. That project led to experiments with the smell profile of grilled chicken.

“At the same time to determine the proper doneness state, one cannot rely on ‘e-nose’ only, but have to use computer vision — these tools give you a so-called electronic panel (a panel of electronic ‘experts’),” said Albert Nasibulin, a professor at Skoltech and Aalto University (Finland). “Building on the great experience in computer vision techniques of our colleagues from Skoltech CDISE, together we tested the hypothesis that, when combined, computer vision and electronic nose provide more precise control over the cooking.”

The researchers combined techniques to accurately observe chicken as it cooked, without contact. The researchers chose chicken because of its global popularity, grilling it extensively to “train” their instruments to evaluate and predict how well it was cooked.

“Images of grilled chickens were obtained using the DFK 33UX250 industrial camera,” said Fedor Fedorov, senior research scientist at Skoltech’s Center for Photonics and Quantum Materials. “We employed the RGB color model in our analysis. It is a color model with 8-bit images, and an integer number is determined for each pixel in the range of 0 to 255. RGB color depends on a mixture model where colors are created using the combination of these colors, namely red (R), green (G), and blue (B). RGB values were taken as features.”
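The paper treats per-pixel RGB values as features; the exact statistics extracted are not detailed in the article, but a minimal sketch of reducing an 8-bit RGB image to per-channel features (the channel means and standard deviations are an assumption for illustration) might look like:

```python
import numpy as np

def rgb_features(image):
    """Reduce an 8-bit RGB image (H x W x 3 array, values 0-255)
    to simple per-channel statistics usable as features.
    The choice of mean/std statistics is illustrative, not from the paper."""
    pixels = image.reshape(-1, 3).astype(float)
    means = pixels.mean(axis=0)   # mean R, G, B over all pixels
    stds = pixels.std(axis=0)     # spread of each channel
    return np.concatenate([means, stds])

# Toy example: a uniform mid-gray 4x4 "image"
img = np.full((4, 4, 3), 128, dtype=np.uint8)
print(rgb_features(img))  # → [128. 128. 128. 0. 0. 0.]
```

As the chicken browns on the grill, statistics like these shift (red and green channels darken unevenly), which is what makes RGB values usable as doneness features.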

The researchers also tested dimensionality reduction techniques on the acquired images, including linear discriminant analysis, latent Dirichlet allocation, and t-distributed stochastic neighbor embedding (t-SNE).
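A hedged sketch of one such technique, t-SNE, applied to a hypothetical matrix of image-derived feature vectors (the data here is random; the dimensions and parameters are illustrative assumptions, not values from the paper):

```python
import numpy as np
from sklearn.manifold import TSNE

# Hypothetical feature matrix: 60 samples, 6 image-derived features each
rng = np.random.default_rng(0)
features = rng.normal(size=(60, 6))

# Embed into 2-D so clusters (e.g., undercooked vs. well-cooked samples)
# could be inspected visually; perplexity must be < number of samples
embedded = TSNE(n_components=2, perplexity=10, random_state=0).fit_transform(features)
print(embedded.shape)  # (60, 2)
```

With real cooking data, samples at similar doneness stages would be expected to land near each other in the 2-D embedding.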

The e-nose comprised eight sensors that detected smoke, alcohol, carbon monoxide, and other compounds, along with temperature and humidity; the researchers placed it inside the ventilation system. Photos taken by the camera were fed into a pattern-recognition algorithm. To define how the odor changed across the various stages of grilling, the researchers used thermogravimetric analysis to monitor the quantity of volatile particles available for the e-nose to detect, differential mobility analysis to measure the size of aerosol particles, and mass spectrometry.

The researchers recruited 16 Ph.D. students and researchers to taste-test the grilled chicken breast, rating its tenderness, juiciness, flavor intensity, appearance, and overall doneness on a 10-point scale. These panel scores were then matched against the analytical results to test the instruments' assessment of doneness against human perception.

Using these techniques, the team reported that their system accurately identified undercooked, well-cooked, and overcooked chicken breast. For the system to work with other cuts of the chicken, the researchers would need to retrain the system on new data.

“We believe we can use other techniques of data handling, i.e., artificial neural networks. Also, the application of a multispectral camera might help to improve the results. We can also consider high-level data fusion, while in the paper, low-level data fusion was used,” Fedorov told Photonics Media.
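The paper used low-level data fusion, with high-level fusion named as a possible refinement. A minimal sketch of the distinction, with hypothetical feature values and classifier outputs (nothing here is taken from the paper):

```python
import numpy as np

# Hypothetical per-sample features (values are illustrative only)
camera = np.array([0.9, 0.4, 0.2])   # e.g., RGB-derived image statistics
enose = np.array([0.1, 0.7])         # e.g., normalized gas-sensor responses

# Low-level fusion: merge raw features into one vector,
# then feed a single classifier
fused_features = np.concatenate([camera, enose])

# High-level fusion: run a separate classifier per modality,
# then combine their decisions (simple averaging shown here)
p_done_camera = 0.8   # hypothetical "well-cooked" probability from images
p_done_enose = 0.6    # hypothetical probability from odor data
p_done_fused = (p_done_camera + p_done_enose) / 2
print(fused_features.shape, p_done_fused)  # (5,) 0.7
```

Low-level fusion lets one model learn cross-modal interactions directly, while high-level fusion keeps each modality's model independent and combines only their verdicts.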

The researchers plan to test their sensors in restaurant kitchen environments. Another potential application could be in sniffing out rotten meat at its early stages of spoilage, when changes in the smell profile are too subtle for human perception.

“We believe these systems can be integrated into industrial kitchens and even in usual kitchens as a tool that can help and advise about the doneness degree of your meat, when direct temperature measurement is not possible or not effective,” Fedorov said.

The research was published in Food Chemistry (www.doi.org/10.1016/j.foodchem.2020.128747).

Vision-Spectra.com
Jan 2021
GLOSSARY
machine vision
Interpretation of an image of an object or scene through the use of optical noncontact sensing mechanisms for the purpose of obtaining information and/or controlling machines or processes.
