Optical Sensors Advancing Precision in Agricultural Production

Emerging methods for plant phenotyping involve optical sensors — from simple RGB image sensors to NIR and Raman spectroscopy.


Agricultural production, already remarkably efficient in much of the developed world, is increasing each year — a necessity to feed the 9+ billion people expected on the planet by 2050. A great example of productivity gains is corn (i.e., maize, Zea mays). One of the most productive crops, corn has increased its U.S. yield sevenfold over the last 100 years, meaning that one-seventh the land is needed to feed the same number of people.¹ While favorable soils and weather have always played a central role in agricultural productivity, the gains have been driven by farmers’ use and adaptation of improved technologies.

Despite past successes, agriculture continues to be challenged to “produce more with less,” as farm inputs such as phosphorus and irrigation water decrease in availability and increase in cost. These challenges are compounded by a changing climate and a need to ensure practices are sustainable both economically and environmentally. As in the past, agriculture will meet new obstacles through research discoveries and new production tools. Unlike in the past, those tools will not be exclusively the domain of agronomists, plant breeders, agricultural engineers and other traditional agricultural scientists; finding solutions will require input and integration from diverse disciplines not previously involved in agriculture, such as photonics and computer science. New sensor and imaging technologies are among the most important and exciting tools that will be incorporated to see and respond to what has previously been intractable. Photonics will help researchers and farmers apply discoveries on the farm to manage crops more rapidly, cheaply and accurately.

Sensors are being incorporated into irrigation systems to apply water based on spectral reflectance. Courtesy of U.S. Department of Agriculture.

Quantifying soil properties

The land area devoted to agriculture is vast but not homogeneous. Zooming from the landscape to the plant scale at any point in time, important differences emerge that can be difficult to quantify, so sensors are needed to constantly monitor agricultural systems at the highest accuracy feasible. At the coarsest view, the agricultural systems and the specific crops grown in a landscape change across the globe, primarily across latitudes. Agricultural fields within these landscapes, as seen from airplanes and satellites, appear as patchworks of different crops adjacent to each other. Adding the dimension of time, the crops in each field rotate from year to year and pass through different stages of a life cycle within a year; each locational and temporal difference results in specific growth requirements to optimize.

Zooming in a little closer, there are gradients of different colors within fields. These gradients are correlated with differences in soil type and the ability of the soil to hold nutrients and moisture. Red, green, blue (RGB), near-infrared spectroscopy (NIRS) and other sensors can quantify these properties throughout the soil profile.² Soil conditions interact with changing weather to affect crop growth; for instance, yield is reduced by flooding in low spots and/or clay soils in wet years, but by drought in high spots and/or sandy soils in dry years. Soil is formed by different processes over tens of thousands of years and might have bands of sand or clay running through a field in ways that intersect with differences in elevation and microclimate. These and other factors interact to create unique growth conditions for each plant in the field that need to be objectively observed and characterized before they can be managed. While soil amendments can reduce such differences in a personal garden, soil within a farm field or a landscape is impractical to homogenize and must be dealt with as it exists.
Unlike the tightly controlled and consistent environments in sensor equipment manufacturing, agricultural production is subject to the biological whims of the crop, the vagaries of the weather and the sovereignty of disconnected markets. Variation must be measured or estimated instead of eliminated — then the crop can be managed for maximum productivity, economic returns and/or sustainability. This is the objective of precision agriculture and where photonics can play a role.

Modern farming and precision agriculture incorporate a large amount of technology including on tractors, sprayers and combines. Courtesy of U.S. Department of Agriculture.

What is precision agriculture?

The major goal of precision agriculture is smarter applications of fertilizer, pesticides and irrigation water to maximize efficiency of production and on-farm profit for the plant variety chosen.

Increasingly, these smarter applications have also meant being better stewards of the environment. Over-application of fertilizers or pesticides to areas of a field that do not need them can pollute surface water and groundwater and cost farmers more per acre without generating any additional crop yield or quality. Conversely, under-application of inputs reduces overall crop production, which can lead to a financial loss after covering the costs of seed, equipment and land. The delicate balance between over- and under-application differs not just between counties and fields but can vary considerably within a field, even plant to plant, and this plant heterogeneity reduces yield.³

Nitrogen is among the most widely applied nutrients in agricultural fertilizers and was among the first precision agriculture success stories. Farmers now routinely use multispectral and crop canopy sensors to apply this input more precisely. It has long been known that nitrogen uptake is directly related to the amount of biomass — a normalized difference vegetative index (NDVI = [NIR − red]/[NIR + red]) calculated from reflectance spectra can approximate the “greenness” of the crop, which in turn can indicate crop stress and plant access to nitrogen. Active (with their own light source) proximal crop reflectance sensors for NDVI, such as Trimble Ag’s GreenSeeker or Holland Scientific’s Crop Circle, have been coupled with variable-rate fertilizer controls so that precise amounts of nitrogen fertilizer can be applied where they are needed in the field. The wavelengths used vary by company and sensor, and some sensors report other indices besides NDVI; in general, red is around 660 nm, NIR around 770 nm and red edge around 735 nm.⁴
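The index itself is a one-line calculation. A minimal sketch in Python; the reflectance values are hypothetical, chosen only to show that a dense green canopy (high NIR, low red reflectance) scores higher than a stressed one:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetative index: (NIR - red) / (NIR + red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# Hypothetical canopy reflectances: healthy vegetation reflects strongly
# in the NIR (~770 nm) and absorbs red light (~660 nm) for photosynthesis.
print(ndvi(nir=0.50, red=0.08))  # dense, green canopy: higher NDVI
print(ndvi(nir=0.30, red=0.15))  # stressed canopy: lower NDVI
```

NDVI is bounded between −1 and 1, so readings from different plots and dates remain directly comparable, which is what makes it usable in variable-rate fertilizer controllers.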

Active NDVI proximal crop reflectance sensors, along with ultrasonic and temperature sensors, are mounted on a high-throughput phenotyping vehicle. The same type of NDVI sensors are often attached to a fertilizer rig. Courtesy of Beth Ann Luedeker, Department of Soil and Crop Sciences, Texas A&M University.

Unmanned aerial vehicles

Among the new technologies generating the highest interest within agriculture right now are UAVs fitted with cameras and other imaging sensors (thus becoming unmanned aerial systems, UASs). The value this affords farmers and crop consultants is in scouting fields for drought stress, pests and disease faster, over a larger area and more systematically than is currently done by manually “walking a field.” For most row crops, the goal of scouting is to find stresses that can be managed with inputs; if farmers have no way to manage a stress, they do not need to know about it and will not be willing to pay for the service. Once stress indicators are identified, the particular stress requires more in-depth manual investigation, although automated sampling methods or higher-resolution diagnosis could become available in the future. There are currently two primary types of UAS vehicles being investigated for field scouting: fixed-wing (plane) aircraft that fly fast and high, and copter aircraft that fly slow and low but offer much better image resolution. Higher resolution aids in observing important details, such as which insects and disease lesions are causing issues, which can help diagnose the specific problem more quickly. However, in addition to being slower and more costly, copters also generate more data to store and base decisions on.

A UAV copter surveys a field for plant height and shapes at the same time as a lidar (on spray tractor). Courtesy of Dr. Sorin Popescu, LASERS Laboratory, Texas A&M University.

A few examples of successful UAS research field problems include the detection of cotton root rot — a nearly intractable soil issue leading to plant death in cotton — for replanting and fungicide control purposes,⁵ the detection of drought yield losses in bean⁶ that could be rescued through irrigation, and the early detection of a leaf rust epidemic in maize that could be controlled by fungicide (Figure 1). However, despite these early research successes, and the fact that UASs can be purchased cheaply and their use in agriculture is highly publicized, there are so far few examples of UASs informing actionable decisions by farmers. Substantial research and education still need to be done before this can become a reality.⁷ Farmers are finding that there are many steps between taking pictures and extracting usable information, and only some of the steps are commonly known. As the current limiting factors of software, analysis and interpretation of UAV data are solved, new data collection needs will emerge. A major need is for less-expensive and lighter-weight imaging sensors with faster shutter speeds (to reduce pixel smear). Better imaging will further decrease costs and be useful for plant scouting. It should be noted that some companies are already betting that satellites will provide high-enough resolution with frequent-enough revisit times that many of the drawbacks of UAVs can be avoided.

Figure 1. Comparing leaf rust disease images with visual ratings 1.5 and 8.5 (dead plants are rated a 10) on two varieties of corn in Corpus Christi, Texas, taken from a UAV. Courtesy of M. Starek, Texas A&M University-Corpus Christi, and G. Odvody, Texas AgriLife Research.

High-throughput phenotyping

Phenotyping is the act of measuring the physical characteristics of plants and animals. A major goal in plant, animal and human research is to identify genes that control phenotype variation, which requires measuring both the DNA and the phenotype of the same organism. While the DNA sequence remains constant, the phenotype can vary dramatically from environment to environment and even within the same field — for example, flood stunting of plants in low spots. The radical improvement in DNA sequencing is one of the few technologies with a rate of advancement exceeding Moore’s law for semiconductors, partially due to improved imaging.⁸ Improvement in phenotype measurement has not kept up and is now often the limiting factor in many genetic studies. High-throughput phenotyping is the scientific community’s response: developing faster and cheaper methods of measuring the physical characteristics of a plant. Many of the emerging phenotyping methods involve optical sensors, from simple RGB image sensors that measure plant structures to NIR and Raman spectroscopy point sensors⁹ and, increasingly, lidar and multispectral and hyperspectral imaging.¹⁰ Because this research is transdisciplinary and rapidly emerging from disparate disciplines — including basic plant biology, computer vision, engineering and others — new forums for discussing research advancements are emerging, such as The Plant Phenome Journal, allowing discoveries and standards to be exposed to the widest possible audience. Some of the technologies emerging from high-throughput phenotyping will likely be scalable and eventually incorporated into precision agriculture. While these phenotyping sensors yield faster and cheaper measurements, they are not always more accurate than manual measurements made by traditional methods. So how important is accuracy in measuring crops?

Approaches in agricultural sensing

Agricultural, mechanical and aeronautical engineers often express concerns about the accuracy of new sensor and image analysis measurements, but accuracy is less of a concern for biological scientists, who in phenotyping care about relative rather than absolute differences, as long as the results are consistent. Differences in the metrics that engineering and agricultural researchers use to assess success are part of the learning curve of interdisciplinary UAS work.⁷ There appear to be two general philosophical approaches that agricultural research and production take with sensor measurements. The first is comparison with a “gold standard” measurement. For example, comparing estimates of plant height from a UAS image with measurements made manually using a ruler, a simple correlation analysis can be conducted to tell whether this is a good measurement or not — or can it? If the correlation is good, some will say the sensor is “as good as or better than manual measurements,” whereas if the correlation is poor, many will blame the manual measurements and call them inaccurate due to “human error,” with the assumption that the sensor is more accurate.
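The “gold standard” comparison can be sketched in a few lines. The plot heights below are invented for illustration; the point is what a correlation can and cannot tell you:

```python
import numpy as np

# Hypothetical plot heights (cm): manual ruler measurements vs. estimates
# extracted from a UAS image mosaic. All values are invented.
ruler = np.array([92.0, 110.5, 87.0, 120.0, 101.5, 95.0])
uas   = np.array([90.2, 112.0, 85.5, 118.3, 104.0, 96.1])

# Pearson correlation between the two measurement methods.
r = np.corrcoef(ruler, uas)[0, 1]
print(f"r = {r:.3f}")

# A high r shows the sensor ranks plots the way the ruler does; it says
# nothing about which method is closer to the true height.
```

This is exactly the gap in the gold-standard framing: a strong correlation confirms agreement between two methods, not the accuracy of either one.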

In contrast, agricultural research can take a different approach that does not require standard measurements to compare against. From an agricultural field scientist’s perspective, the first step in evaluating a new photonics sensor estimate may not be a correlation but instead determining whether the differences between varieties or treatments (drought, disease, etc.) are greater than the differences due to measurement error. If so, the measurement can be monitored across years and environments for relationships to yield and sustainable production. In most experiments this is done by including multiple replicates within a field and running restricted maximum likelihood (REML) models to compare how consistent different replications of the same variety and treatment are relative to the error (unexplained) variation between replicates. The measurements with the least unexplained error variation are the most useful. This approach has found that on some days the manual ruler measurements are more consistent for plant height, while on other days the UAS sensor estimates are more consistent (why is not known, but it suggests an interesting line of future research). So while it is tempting to envision sensors as merely automating manual measurements already being made, it is more exciting to envision how photonics could predict crop aspects impractical or impossible to consider previously — for instance, a reflectance wavelength that predicts a specific disease or deficiency previously unknown. A slight variation on this approach is starting to be used with machine learning, one of the most exciting new analysis methods being applied to these massive agricultural photonics datasets.¹¹
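The replicate-consistency idea can be illustrated without REML software. For a balanced one-way layout, classic ANOVA variance components carry the same logic; the trial below (three varieties, three replicate plots each) uses made-up heights:

```python
import numpy as np

# Hypothetical trial: 3 varieties x 3 replicate plots, plant height in cm.
# A real analysis would fit REML to the full field design; this balanced
# toy layout is only meant to show the variance-partitioning logic.
heights = np.array([
    [101.0,  99.5, 100.8],  # variety A
    [121.2, 119.0, 120.4],  # variety B
    [ 95.1,  96.8,  94.9],  # variety C
])
n_reps = heights.shape[1]

ms_error = heights.var(axis=1, ddof=1).mean()           # replicate (error) variance
ms_variety = n_reps * heights.mean(axis=1).var(ddof=1)  # variety mean square
var_g = max((ms_variety - ms_error) / n_reps, 0.0)      # variety variance component

# Repeatability: share of variation among variety means that is not error.
repeatability = var_g / (var_g + ms_error / n_reps)
print(f"repeatability = {repeatability:.3f}")
```

A measurement whose repeatability stays high across days and environments is useful even if no gold standard exists for it, which is the point of this second philosophy.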

What types of sensors are needed?

There are two main niches of desirable new sensors in the short term for field-based agriculture: improved versions of existing sensors and novel research sensors that can be deployed to measure new phenotypes. First, improved versions of existing sensors should be able to consistently and quickly collect higher-resolution RGB (or, preferably, multispectral for NDVI) images over increasingly large areas while being both ruggedized and inexpensive (UAVs frequently crash). Improvements from physically larger sensors, such as better resolution and faster shutter speeds, would allow higher and faster UAS flights for precision agriculture and perhaps eventually satellites. For research phenotyping with existing sensors, we would like to fly above 500 feet, faster than 30 miles per hour, and still get sub-centimeter accuracy on the ground. No sensor to date has this capability, and this resolution is far more than precision agriculture currently requires. Improved sensors will allow agriculture to observe crops at a level of granularity never before feasible and reach the goal of individual plant management.
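To see why those flight parameters are demanding, consider the standard ground-sample-distance approximation for a pinhole camera. The sensor parameters below are hypothetical, not drawn from any particular product:

```python
# Ground sample distance (GSD): the ground footprint of one image pixel,
# from the standard pinhole-camera approximation. Sensor parameters here
# are hypothetical and chosen only for illustration.
def gsd_cm(pixel_pitch_um: float, focal_length_mm: float, altitude_m: float) -> float:
    """Return cm of ground covered per pixel: pixel pitch * altitude / focal length."""
    return (pixel_pitch_um * 1e-6) * altitude_m / (focal_length_mm * 1e-3) * 100.0

# A 3.7-um-pitch sensor behind a 35-mm lens, flying at 500 ft (~152 m):
print(round(gsd_cm(3.7, 35.0, 152.0), 2), "cm/pixel")
```

With these assumed optics the footprint is roughly 1.6 cm per pixel, well above the sub-centimeter target, which is why current flights must go lower and slower than researchers would like.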

Fixed-wing UAVs are more appropriate for surveying farmers’ fields than rotary wings because they can fly faster and higher, allowing more land to be covered for lower cost. Courtesy of Shay Simpson, Texas A&M Agrilife Research.

For developing novel sensors, extensions of Raman spectroscopy, X-ray fluorescence and hyperspectral spectroscopy across the electromagnetic spectrum would be desirable to detect new compounds from a UAS, but these would still need to be rugged, small and low-power enough to be deployed on a UAV. Sensors could be tuned to detect important hormone or signaling compounds that could indicate stress before other physiological signs are observed,⁹ or, more excitingly, could use the open-architecture approach discussed above to first find repeatable measures and then determine whether these never-before-detected signatures correspond to something useful. Research is also starting to look below ground, using technologies such as ground-penetrating radar, magnetic resonance imaging, thermoacoustic imaging and X-ray to measure roots. It is critical to recall that it is less important to know what is being measured with these sensors than to demonstrate that the measurements are repeatable. Therefore, nearly any new sensor is worth trying in a crop system to identify new properties to be measured in plants.

Challenges of disrupting agriculture

With so many opportunities to apply technology to agriculture, it has recently become fashionable for many companies, both established firms and especially technology startups, to discuss technology’s ability to “disrupt agriculture.” It is tempting to naively look at the number of acres involved in agriculture, multiply by a very small cost and envision very large profits. However, time and again it has been demonstrated that real value must be generated for farmers to adopt a new technology regardless of how small the cost, and even when a technology is profitable, they still may not adopt it.¹² There are also many business challenges unique to agriculture, such as only one season of each crop per year, uncontrollable experimental losses from weather, the need for a counter-season business strategy when there is no crop in the ground, and a regionalism in crops that prevents scaling. Anyone interested in applying a new technology — photonics and otherwise — is strongly advised to first find agricultural experts in the area with whom to collaborate. Land-grant universities are a great place to start because they typically have basic, applied, translational and extension research experiments occurring simultaneously that can easily leverage testing of any nondestructive sensor technology.

A quadcopter flies above crops to image them in Corpus Christi, Texas.

A quadcopter flies above crops to image them in Corpus Christi, Texas. Many different types of imaging sensors are being tested on UAVs. Courtesy of Dr. Jinha Jung, School of Engineering and Computer Sciences, Texas A&M University-Corpus Christi.

While the huge potential impact of applying photonics to agriculture is exciting, the promise of the technology alone won’t solve the problem of sustainably feeding 9+ billion people. Developing new sensors and building the interdisciplinary cooperation to address agriculture’s needs is increasingly urgent.

Meet the author

Dr. Seth C. Murray is an associate professor and the Eugene Butler Endowed Chair in the Department of Soil and Crop Sciences at Texas A&M University. He recently founded and is editor of The Plant Phenome Journal.


1. E.C. Brummer et al. (2011). Plant breeding for harmony between agriculture and the environment. Front Ecol Environ, Vol. 9, Issue 10, pp. 561-568.

2. J.P. Ackerson et al. (2017). Penetrometer-mounted VisNIR spectroscopy: Application of EPO-PLS to in situ VisNIR spectra. Geoderma, Vol. 286, pp. 131-138.

3. K.L. Martin et al. (2005). Plant-to-plant variability in corn production. Agron J, Vol. 97, Issue 6, pp. 1603-1611.

4. L.K. Sharma et al. (2015). Active-optical sensors using red NDVI compared to red edge NDVI for prediction of corn grain yield in North Dakota, USA. Sensors, Vol. 15, Issue 11, pp. 27832-27853.

5. Y. Huang and S.J. Thomson (2015). Remote sensing for cotton farming. In Cotton, 2nd ed. Agronomy Monograph 57, pp. 439-464.

6. J.J. Trapp et al. (2016). Selective phenotyping traits related to multiple stress and drought response in dry bean. Crop Sci, Vol. 56, Issue 4, pp. 1460-1472.

7. Y. Shi et al. (2016). Unmanned aerial vehicles for high-throughput phenotyping and agronomic research. PloS One, Vol. 11, Issue 7, e0159781.

8. J.M. Rothberg et al. (2011). An integrated semiconductor device enabling non-optical genome sequencing. Nature, Vol. 475, Issue 7356, pp. 348-352.

9. N. Altangerel et al. (2017). In vivo diagnostics of early abiotic plant stress response via Raman spectroscopy. Proc Natl Acad Sci USA, Vol. 114, Issue 13, pp. 3393-3396.

10. J.L. Araus and J.E. Cairns (2014). Field high-throughput phenotyping: the new crop breeding frontier. Trends Plant Sci, Vol. 19, Issue 1, pp. 52-61.

11. A. Singh et al. (2016). Machine learning for high-throughput stress phenotyping in plants. Trends Plant Sci, Vol. 21, Issue 2, pp. 110-124.

12. D. Schimmelpfennig (2016). Farm Profits and Adoption of Precision Agriculture (No. 249773). United States Department of Agriculture, Economic Research Service.

Published: May 2017