Bugs are more than just a nuisance. Many do substantial damage – and not just to our serenity. Insects, plant pathogens and weed pests destroy 40 percent of the world’s potential food production every year, despite the application of approximately 3 million tons of pesticide and various nonchemical crop controls. If that isn’t enough, more than half the world’s population is at risk from diseases carried by mosquitoes, flies and other vectors.

Because blanket pesticide application doesn’t seem to be effective, intelligent pest control systems have been proposed – but development has long been stagnant due to unreliable devices, a focus on the wrong parameters and a lack of data. Now, researchers at the University of California, Riverside, have found the sweet spot of insect classification, using inexpensive materials to build an optical sensor that identifies flying insect species with 99 percent accuracy for targeted, precise eradication.

“We set out not knowing what was possible,” said Dr. Eamonn Keogh, a computer science professor at UC Riverside’s Bourns College of Engineering. “Now, the problem is essentially solved. We have created insect classification tools that can outperform the world’s top entomologists in a fraction of the time.”

[Photo: Graduate student Yanping Chen, Dr. Eamonn Keogh and graduate student Adene Why hold their optical sensor equipment, which can identify flying insect species with 99 percent accuracy. Courtesy of Peter Phun/University of California, Riverside.]

Keogh and his colleagues created wireless optical sensors housed in plastic terrariums from a local pet store. The sensors identify insects by analyzing the patterns generated when wings partially block a laser beam. Inside the shoebox-sized container are a phototransistor array, an electronic board and a laser. When an insect crosses the laser beam, its wings partially block the light, causing small fluctuations.
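The core measurement here is the rhythmic dimming of the laser as the wings beat. As a minimal sketch of that idea – not the UCR team’s actual code – the snippet below simulates a phototransistor signal with a hypothetical 400 Hz wingbeat riding on the laser’s baseline brightness, then recovers the dominant wingbeat frequency with a Fourier transform.

```python
# Minimal sketch of wingbeat-frequency extraction from an occlusion signal.
# The 400 Hz wingbeat, sample rate, and noise level are illustrative
# assumptions, not values from the study.
import numpy as np

def dominant_frequency(signal: np.ndarray, sample_rate: int) -> float:
    """Return the strongest nonzero frequency component of the signal, in Hz."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    spectrum[0] = 0.0  # ignore the DC offset from the laser's steady brightness
    return float(freqs[np.argmax(spectrum)])

# One second of simulated sensor output: steady light, a small periodic dip
# as the wings beat at 400 Hz, and measurement noise.
rate = 8000
t = np.arange(rate) / rate
rng = np.random.default_rng(0)
sensor = 1.0 + 0.2 * np.sin(2 * np.pi * 400 * t) + 0.05 * rng.standard_normal(rate)

print(dominant_frequency(sensor, rate))  # ≈ 400.0
```

In practice the wingbeat frequency alone is rarely enough to separate species, which is why the researchers fold in additional variables, as described below.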
The fluctuations are captured by the phototransistor array as changes in current. The signal is filtered and amplified by the electronic board, fed into a digital sound recorder and later downloaded to a computer for analysis.

The team’s success lay in the unconventional use of optical sensors instead of the typical acoustic ones. When listening to an insect through a microphone, sound diminishes according to an inverse-square law: the sound intensity of a flying insect three units away from a microphone, for example, drops to one-ninth of its original power. To make up for those losses, some researchers use unreliable tactics to increase sound levels, such as tapping or prodding insects under bright halogen lights, or even tethering them to a string to keep them close to the microphone. Keogh’s team instead used a laser, bypassing those hurdles with a clear optical path to accurate data collection.

As the researchers added more insect flight behavior patterns to their classification algorithm, their success rate in distinguishing species rose. Using wingbeat sounds alone, they achieved an 88 percent success rate. Adding time of day as a variable raised it to 95 percent, and adding location pushed it to 97 percent. The researchers believe they can improve that mark further by adding variables such as height of flight, temperature or humidity.

Frugality has always been of great importance to this project: Keogh’s original sensor was made of Legos, a 99-cent-store laser pointer and a TV remote control. The goal of the research was to make automated classification as simple, inexpensive and ubiquitous as today’s sticky traps and interception traps, but with the digital advantages of increased accuracy, real-time monitoring and the collection of flight behavior patterns.
Keogh believes the sensors can be solar-powered and built for less than $10, to aid medical and agricultural entomology around the world. They are currently being used on a small scale in Brazil and Hawaii, with plans underway to deploy them in Mali for malaria prevention.