In the Quest for a Healthier Grape, AI Uncorks a Defense

JAKE SALTZMAN, NEWS EDITOR [email protected]

Powdery mildew is a costly problem. An outbreak of the plant disease could wipe out an entire greenhouse crop. Treating the mildew with fungicides can cost thousands of dollars, and such treatment may only work if the outbreak is detected early.

Among the crops most prone to a powdery mildew infection are grapes. A typical grape grower can spend about 10% of production costs just to manage the fungus.

This cost equates to a multibillion-dollar global headache in the global wine market alone, which is expected to exceed $435 billion by 2027, according to Research and Markets.

Research plant pathologist Lance Cadle-Davidson drives a vehicle equipped with a camera through a vineyard. He and his team, along with Cornell assistant professor Yu Jiang and his group, are developing AI algorithms to detect plant disease in vineyards after experiencing success using the BlackBird imaging robot to detect disease in the lab. Courtesy of Surya Sapkota.

Fortunately, powdery mildew is easily identified. The leaves of infected grape plants will exhibit the disease’s trademark white spores. Pathologists can usually confirm the diagnosis using conventional light microscopy.

While light microscopy methods are precise, they can be time-consuming. A typical experiment can involve as many as 2500 leaf samples, necessitating two months or more of manual microscopic analysis, said Lance Cadle-Davidson, a research plant pathologist at the U.S. Department of Agriculture Grape Genetics Research Unit in Geneva, N.Y. Cadle-Davidson is also an adjunct professor in the School of Integrative Plant Science (SIPS) at Cornell AgriTech, Cornell University’s center for agriculture and food research.

The large time expenditure poses a problem for labs such as Cadle-Davidson’s, which receive thousands of grape leaf samples from around the world each year to analyze manually.

Precision capture

Cadle-Davidson is working with engineer Yu Jiang, an assistant professor in SIPS’ Horticulture Section at Cornell AgriTech, to expedite the diagnosis of powdery mildew via robotic imaging, computer vision, and deep learning.

The process begins with a robotic camera called BlackBird, which captures images of grape leaves at a scale of 1.2 µm per pixel.

“For each image of each leaf, we cut one section from the sample,” Jiang said. “It almost looks like a leaf disc, about 1 cm × 1 cm from each sample. Then we use BlackBird to acquire the image of that disc. Each image is 5000 × 8000 pixels. We separate that whole leaf disc sample into image patches that measure 224 × 224 pixels. And we label all of those patches.”

In other words, for each imaged 1-cm leaf disc, the robotic camera delivers a 5000 × 8000-pixel (40-megapixel) image.
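The team’s tiling code isn’t published; a minimal sketch of the step Jiang describes — cutting a 5000 × 8000-pixel disc image into non-overlapping 224 × 224-pixel patches — might look like this (the `extract_patches` helper is hypothetical):

```python
import numpy as np

def extract_patches(image, patch_size=224):
    """Split a leaf-disc image into non-overlapping square patches.

    Trailing rows/columns that do not fill a complete patch are dropped.
    """
    h, w = image.shape[:2]
    patches = []
    for y in range(0, h - patch_size + 1, patch_size):
        for x in range(0, w - patch_size + 1, patch_size):
            patches.append(image[y:y + patch_size, x:x + patch_size])
    return patches

# A 5000 x 8000 disc image yields 22 x 35 = 770 complete patches,
# consistent with the roughly 800 subimages per disc the team labels.
disc = np.zeros((5000, 8000), dtype=np.uint8)
print(len(extract_patches(disc)))  # 770
```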

A field robot developed in Jiang’s lab moves through a plot. Researchers are developing the technology to detect plant disease in vineyards and elsewhere in the field. Courtesy of Cyber-Agricultural Intelligence and Robotics Lab and Cornell AgriTech/Cornell University.

The scale and detail of the images produced by BlackBird are equivalent to what an optical microscope would produce, but BlackBird delivers them in a fraction of the time.

A model built to be understood

The information obtained by BlackBird provides Cadle-Davidson and his team with the raw-form data that they need to continually assess leaf samples and diagnose disease.

After the researchers image the 1-cm leaf discs, each disc image is broken down into 224 × 224-pixel subimages — roughly 800 per disc. The researchers then feed the subimages into a deep neural network developed by Jiang and his team.

Jiang trained the neural network on nearly 30,000 labeled images of grape leaves — engineering it in the same way that other AI models are engineered for facial recognition and other vision tasks. The model enables the Cornell pathologists to extract information from the collected data and compare it against their high volume of samples in a short period of time.
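The article doesn’t specify how the per-patch predictions become a disc-level diagnosis; one plausible aggregation — the fraction of patches the network flags as infected — could be sketched as follows (the function name and the 0.5 threshold are illustrative, not the team’s published method):

```python
import numpy as np

def disc_severity(patch_probs, threshold=0.5):
    """Fraction of patches classified as infected.

    patch_probs: per-patch probabilities of powdery mildew from a trained
    classifier (values in [0, 1]). The 0.5 cutoff is illustrative.
    """
    patch_probs = np.asarray(patch_probs)
    infected = patch_probs >= threshold
    return infected.mean()

probs = [0.9, 0.1, 0.8, 0.2, 0.7]  # hypothetical network outputs
print(disc_severity(probs))  # 0.6
```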

Jiang’s AI model greatly accelerated the image-to-diagnosis process: The pathologists needed only one day to complete experiments that would once have taken six months.

The technology relies on an aptly named concept called “explainable AI,” which prioritizes human comprehension by making it possible to decipher how and why an AI system arrived at its output.

“Deep neural networks that use a black box system work great,” Jiang said. “But nobody understands why.”

AI models are generally “black boxes,” meaning that the processes that the network uses to interpret a collection of information are unknowable by the user. When Jiang uses explainable AI, however, he is able to understand some of the principal considerations that the trained networks use to make predictions. He can pass this information along to the pathologists.
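The article doesn’t name the explainability technique Jiang uses; occlusion sensitivity is one common model-agnostic probe that works on any black-box classifier, sketched here with a toy stand-in for the trained network:

```python
import numpy as np

def occlusion_map(image, predict, window=56, stride=56):
    """Occlusion sensitivity: slide a gray window over the image and
    record how much the model's score drops when each region is hidden.
    Large drops mark the regions the model relied on for its prediction.
    """
    base = predict(image)
    h, w = image.shape[:2]
    heat = np.zeros(((h - window) // stride + 1, (w - window) // stride + 1))
    for i, y in enumerate(range(0, h - window + 1, stride)):
        for j, x in enumerate(range(0, w - window + 1, stride)):
            occluded = image.copy()
            occluded[y:y + window, x:x + window] = 128  # gray out region
            heat[i, j] = base - predict(occluded)
    return heat

# Toy stand-in for a trained network: score = mean brightness of the
# top-left quadrant, so occluding that quadrant lowers the score.
def toy_predict(img):
    return img[:112, :112].mean() / 255.0

img = np.full((224, 224), 200, dtype=np.uint8)
heat = occlusion_map(img, toy_predict)
print(heat.shape)     # (4, 4)
print(heat.argmax())  # 0 -- the largest drops lie in the top-left cells
```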

The risk associated with certain vision applications — such as those involving pedestrian safety — is inherently high, Jiang said. When sampling powdery mildew, the risk comes from the need to combine two completely different biological findings. A loss of even a single percentage point of accuracy in the visual analysis of either finding could result in misinterpreted or misunderstood data.

“Over time, that might cause millions of dollars in yield loss,” Jiang said. It is therefore invaluable for the pathologists to precisely understand how the trained networks make predictions.

A researcher from Cornell University’s School of Integrative Plant Science’s Plant Pathology and Plant-Microbe Biology Section uses a phenotyping robot to analyze powdery mildew on hops plants and grape leaves. Courtesy of Cyber-Agricultural Intelligence and Robotics Lab and Cornell AgriTech/Cornell University.

“Explainable AI also helps us to use the feature more creatively to quantify the disease to improve accuracy,” he said. “We can improve the trustworthiness of the model. We can build some trust between the human experts and the modeling process.”

Infrared possibilities

From the mid-1800s until about 2019, plant growers and pathologists, including in Cadle-Davidson’s group, detected powdery mildew using manual microscopy methods. Any alternative to this labor-intensive process was welcome. Cadle-Davidson made the first step toward a solution at a time when his group was collaborating with a company that manufactures 3D printers.

“We realized that if we took the printer head out of a 3D printer and replaced it with a camera, we could perform the automation that we needed to move from sample to sample in the x and y dimension and focus on the z-axis,” Cadle-Davidson said. “That is one of the biggest challenges with microscopy imaging when you are working at this level of magnification. It is hard to get the depth of field that is necessary to capture the topology of a leaf. We have to take several images and stack those to get one crisp, in-focus image.”
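The team’s focus-stacking implementation isn’t published; a toy sketch of the general idea Cadle-Davidson describes — keeping, for each pixel, the z-slice with the strongest local sharpness — assuming grayscale frames and a simple gradient-based sharpness measure (production pipelines typically use Laplacian pyramids):

```python
import numpy as np

def focus_stack(stack):
    """Merge a z-stack into one all-in-focus image.

    stack: (n, h, w) grayscale frames taken at different focus depths.
    For each pixel, keep the frame with the strongest local gradient,
    a crude proxy for local sharpness.
    """
    stack = np.asarray(stack, dtype=float)
    gy, gx = np.gradient(stack, axis=(1, 2))   # per-frame image gradients
    sharpness = np.hypot(gx, gy)
    best = sharpness.argmax(axis=0)            # (h, w) index of sharpest frame
    rows, cols = np.indices(best.shape)
    return stack[best, rows, cols]

# Demo: a frame with structure (a ramp) beats a defocused, flat frame.
z_stack = [np.tile(np.arange(8.0), (8, 1)),   # "in focus": has gradients
           np.full((8, 8), 5.0)]              # "defocused": featureless
fused = focus_stack(z_stack)                  # recovers the ramp frame
```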

While Jiang optimized the neural network component, the task of Cadle-Davidson’s team was to tailor the optics for the application. Early imaging prototypes used microscope lenses found in the team’s lab.

The lenses proved insufficient to capture a full leaf disc in a single image.

“You can capture a leaf disc by sampling 10 different times and 10 different areas,” Cadle-Davidson said. “But that destroyed our throughput.”

The current iteration of BlackBird uses commercially available lenses and off-the-shelf components. The collaborators perfected the optics and the sensor to specifically detect powdery mildew on grape leaves.

Plant diseases such as powdery mildew can be detected in the infrared range, often before they are visible to the naked eye. This capability is guiding the next phase of BlackBird’s deployment.

“We are now working on what we are calling ‘HyperBird,’” Jiang said. “It is the BlackBird frame incorporated with a hyperspectral camera.”

The addition of a hyperspectral camera will extend the imaging range from the visible RGB spectrum into the visible and near-infrared (VNIR) band, which extends from about 400 to 1100 nm.

“It is a natural transition for this work because with the naked eye you can only see what color tells you or what texture tells you,” Jiang said. “Many chemical compounds, biophysical properties, and biochemical reactions reflect through different wavelengths.”
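The article doesn’t say which spectral features HyperBird will use; as one example of what VNIR bands make possible, the normalized difference vegetation index (NDVI) contrasts red and near-infrared reflectance to flag stressed tissue (the band choices and example values here are typical illustrations, not the team’s):

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from VNIR reflectance.

    red: reflectance near ~670 nm; nir: reflectance near ~800 nm.
    Healthy leaf tissue reflects strongly in the NIR, so stressed or
    diseased tissue tends to score lower.
    """
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

print(round(float(ndvi(0.05, 0.50)), 2))  # 0.82 -- healthy-looking tissue
print(round(float(ndvi(0.10, 0.30)), 2))  # 0.5  -- stressed tissue
```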

From the lab to the field

Powdery mildew affects a wide range of plants in addition to grapes, including melons, gourds, legumes, cereal grains, and flowers.

BlackBird is currently only a laboratory instrument, but Cadle-Davidson and Jiang envision that similar robotic and AI-enabled systems could be used in a vineyard or elsewhere in the field to detect plant disease in a variety of plants.

Jiang and his group have developed a robot called PhytoPatholoBot (PPB) that uses a strobe light-enabled stereo camera designed for field imaging. Images from the robotic camera quantify the infection area of a sample.

The PPB system could allow pathologists to evaluate the data that they obtain in the laboratory by comparing it against results gathered in the field. Combining the data sets will help to build successful AI models.

“Whatever resolution we deploy in the field, that will help us make some sort of link between field performance and laboratory performance,” Jiang said.

The information will specifically help to determine the gap between field performance and laboratory performance, and from a science perspective, Jiang said, to determine the cause of the gap. Once he obtains and provides the two data sets — one from the fundamental laboratory data and one from the field data — the pathologists can begin to comprehend the AI-based processes that they are seeing. From there, they can then understand why the explainable AI model is delivering the pathological information in either setting, in certain situations.

Many traditional models designed for similar data evaluation applications are considered “blind,” Jiang said. These models are unable to make intelligent use of previously collected laboratory information.

“We all know that these models won’t be reliable. If you change to a different vineyard, go back the next year, you are going to get different images leading to different predictions,” he said. “If we can link back to our fundamental knowledge and determine why certain developments are driving the optical changes that we see, we will have much greater confidence to say, ‘This is a reliable model for field evaluation.’”

As work on the AI front continues, Cadle-Davidson’s team is applying the current BlackBird technology to other commercially valuable crops besides grapes. A grant secured last summer will send the group to the Pacific Northwest to diagnose powdery mildew on hops plants. The team is also working with a group in North Dakota to assess damage to oats caused by oat rust.

“There are a number of invasive diseases that we expect are going to cause huge agricultural challenges,” Cadle-Davidson said. “Those are some of the areas we are really looking to help in. Hyperspectral imaging is an exciting field.”

“Right now, BlackBird can only see what we see,” he said. “But there is a lot of important biology that happens outside of the visible spectrum.”

Vision Spectra
Spring 2022
Vision in Action · neural networks · AI · cameras · agriculture · robotics
