Satellite Imaging and Field Cameras Monitor Corn Production

URBANA, Ill., March 10, 2020 — Scientists from the University of Illinois have developed a scalable method of estimating crop productivity with satellite imagery. The research combines field measurements, a unique in-field camera network, and high-resolution, high-frequency satellite data, providing highly accurate productivity estimates for crops across Illinois and beyond.

The researchers used reflectance data, which measures light bouncing off Earth, from two types of satellites to estimate leaf area index (LAI) in agricultural fields. Both data sets represent major improvements over older satellite data; they are capable of imaging Earth at a fine scale (3-m or 30-m resolution) and both return to the same spot above the planet on a daily basis. Since the satellites don’t capture LAI directly, the research team developed two mathematical algorithms to convert surface reflectance into LAI.
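The study's two conversion algorithms are not spelled out here, but a common way to relate surface reflectance to LAI is through a vegetation index such as NDVI followed by a Beer's-law-style inversion. The sketch below assumes red and near-infrared reflectance bands and uses placeholder coefficients (ndvi_soil, ndvi_max, and the extinction coefficient k are illustrative values, not the study's calibrated ones).

import numpy as np

def ndvi(red, nir):
    # Normalized difference vegetation index from red and near-infrared reflectance.
    return (nir - red) / (nir + red)

def lai_from_ndvi(v, ndvi_soil=0.15, ndvi_max=0.90, k=0.6):
    # Beer's-law-style inversion of NDVI to LAI.
    # ndvi_soil, ndvi_max, and k are illustrative placeholders, not the
    # calibrated coefficients from the published study.
    frac = np.clip((ndvi_max - v) / (ndvi_max - ndvi_soil), 1e-6, 1.0)
    return -np.log(frac) / k

# Example: per-pixel surface reflectance (unitless, 0 to 1) for a tiny 2 x 2 patch
red = np.array([[0.05, 0.08], [0.06, 0.12]])
nir = np.array([[0.45, 0.40], [0.50, 0.30]])
print(lai_from_ndvi(ndvi(red, nir)))

In practice, the index, functional form, and coefficients would be fit to each satellite product and crop, which is where field calibration such as the camera network described below comes in.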
University of Illinois doctoral student Hyungsuk Kimm set up a network of cameras in cornfields around Illinois to ground-truth satellite-based algorithms to monitor corn productivity in real time. Courtesy of Hyungsuk Kimm, University of Illinois.

“Our ultimate goal is to provide useful information to farmers, especially at the field level or subfield level,” said Hyungsuk Kimm, a doctoral student in the Department of Natural Resources and Environmental Sciences (NRES) at the University of Illinois and lead author on the study. “Previously, most available satellite data had coarse spatial and/or temporal resolution, but here we take advantage of new satellite products to estimate leaf area index, a proxy for crop productivity and grain yield. And we know the satellite estimates are accurate because our ground measurements agree.”

While developing the algorithms to estimate LAI, Kimm worked with Illinois farmers to set up cameras in 36 cornfields across the state, providing continuous ground-level monitoring. The images from the cameras provided detailed ground information to refine the satellite-derived estimates of LAI.

In the true test, the satellite estimates from the two algorithms strongly agreed with Kimm’s “ground truth” data from the fields. This result means the algorithms delivered highly accurate, reliable LAI information from space and can be used to estimate LAI in fields anywhere in the world in real time.
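The article reports strong agreement but not how it was scored; a typical validation pairs each satellite LAI estimate with the camera-derived LAI for the same field and date and computes error and fit statistics. A minimal sketch with hypothetical paired values:

import numpy as np

def agreement(satellite_lai, ground_lai):
    # Root-mean-square error and coefficient of determination between
    # satellite-derived and in-field (camera-derived) LAI values.
    sat = np.asarray(satellite_lai, dtype=float)
    gnd = np.asarray(ground_lai, dtype=float)
    rmse = np.sqrt(np.mean((sat - gnd) ** 2))
    r2 = 1.0 - np.sum((gnd - sat) ** 2) / np.sum((gnd - gnd.mean()) ** 2)
    return rmse, r2

# Hypothetical field-date pairs, not data from the study
rmse, r2 = agreement([1.2, 2.8, 4.1, 5.0], [1.0, 3.0, 4.3, 4.8])
print(f"RMSE = {rmse:.2f}, R^2 = {r2:.2f}")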

“We are the first to develop scalable, high-temporal, high-resolution LAI data for farmers to use,” said Kaiyu Guan, assistant professor in the Department of NRES and Blue Waters professor at the National Center for Supercomputing Applications. He is also principal investigator on the study. “These methods have been fully validated using an unprecedented camera network for farmland.”

Having real-time LAI data could be instrumental for responsive management. For example, the satellite method could detect underperforming fields or segments of fields that could be corrected with targeted practices such as nutrient management, pesticide application, or other strategies. Guan plans to make real-time data available to farmers in the near future.
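The article does not specify how such flagging would work; one simple possibility is to mark pixels whose LAI falls well below the field median. A sketch under that assumption (the 25% shortfall threshold is arbitrary, chosen only for illustration):

import numpy as np

def flag_underperforming(lai_map, shortfall=0.25):
    # Flag pixels whose LAI is more than `shortfall` below the field median.
    # The default 25% threshold is an arbitrary illustration, not a value
    # recommended by the researchers.
    lai = np.asarray(lai_map, dtype=float)
    return lai < (1.0 - shortfall) * np.nanmedian(lai)

# Hypothetical LAI map for one field (e.g., from 3-m pixels)
lai_map = np.array([[4.2, 4.0, 2.1], [4.1, 3.9, 2.3], [4.3, 4.2, 4.0]])
print(flag_underperforming(lai_map))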

“The new LAI technology developed by Dr. Guan’s research team is an exciting advancement with potential to help farmers identify and respond to in-field problems faster and more effectively than ever before,” said Laura Gentry, director of water quality research for the Illinois Corn Growers Association.

According to Gentry, more accurate measurements of LAI can help farmers be more efficient, more timely, and therefore more profitable.

The research was published in Remote Sensing of Environment (www.dx.doi.org/10.1016/j.rse.2019.111615). 


Photonics.com
Mar 2020
