
Fingerprint tech gets tested

Photonics Spectra
Jul 2009
Rebecca C. Jernigan

The characteristics that help to differentiate one set of fingerprints from another were first defined in 1892 by Sir Francis Galton. In the decades since, fingerprint identification has become a staple of forensic investigations.

Originally, it was a time-consuming process that required manual comparison of a latent print with dozens or hundreds of prints previously taken from suspects, with no guarantee of finding a match.


At left, a latent fingerprint – such as this one pulled from a nail file – can be the key to solving a mystery. Automated feature extraction could make the process of identifying such prints faster and easier. At right, this latent print, imaged at 39.37 pixels per millimeter, is similar to those NIST used to test automated feature identification technologies under its ELFT program. Images courtesy of NIST.

In recent years, however, computers have been used to find potential matches between latent prints and those in crime databases. Human assistance is still needed to mark the important features of a latent print before it is entered into the computer because latent prints typically are incomplete, smudged, or lifted from textured surfaces. Without this input, the machines have difficulty matching prints to their potential mates. And because every image requires this manual processing, the time technicians can spend on more difficult prints is drastically reduced.
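The features an examiner marks by hand are typically minutiae: points where a friction ridge ends or splits, each recorded with a position and a ridge direction. As a rough illustration (this is a toy representation, not NIST's or any vendor's actual data format or matching logic), a minutia record and a naive comparison might look like:

```python
# Toy sketch of minutiae-based comparison. The Minutia fields and the
# matching thresholds below are illustrative assumptions, not a real
# AFIS format or algorithm.
from dataclasses import dataclass


@dataclass(frozen=True)
class Minutia:
    x: int          # pixel column
    y: int          # pixel row
    angle: float    # ridge direction in degrees
    kind: str       # "ending" or "bifurcation"


def count_nearby_matches(latent, candidate, dist=10, angle_tol=15.0):
    """Count latent minutiae that have a same-type candidate minutia
    within `dist` pixels and `angle_tol` degrees (a deliberately
    simplistic stand-in for real minutiae pairing)."""
    hits = 0
    for m in latent:
        for c in candidate:
            if (m.kind == c.kind
                    and abs(m.x - c.x) <= dist
                    and abs(m.y - c.y) <= dist
                    and abs(m.angle - c.angle) <= angle_tol):
                hits += 1
                break
    return hits


latent = [Minutia(100, 120, 45.0, "ending"),
          Minutia(140, 90, 200.0, "bifurcation")]
candidate = [Minutia(103, 118, 48.0, "ending"),
             Minutia(300, 300, 10.0, "ending")]
print(count_nearby_matches(latent, candidate))  # 1
```

Real systems must also handle rotation, skin distortion, and partial overlap, which is precisely why smudged or incomplete latents have required a human in the loop.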

Many companies and universities are trying to ease the workload of fingerprint technicians by developing algorithms and programs to automate the majority of the work. Before these new technologies can be considered for forensic investigations, they must be independently tested and evaluated to help potential users find the best program for their application.

This is where the National Institute of Standards and Technology (NIST) comes in. For the past two years, researchers at the institute have been working on the first two phases of the Evaluation of Latent Fingerprint Technology (ELFT) program. Beginning in 2007, investigators invited companies and universities to submit their feature-identification algorithms – whether commercially available or still in development – for examination in a study that would determine their strengths and weaknesses as well as provide a baseline for the industry. For Phase II of the program, which began in 2008, eight organizations provided technology for testing.

These eight software systems were tested with 835 latent fingerprints, all of which were compared with two galleries, or databases, of 10-print records – fingerprint sets taken by police. The first gallery contained 5000 records (50,000 fingerprints) and the second, 10,000 (100,000 fingerprints). All 835 prints had matches in both galleries. Each system processed the 10-print records and latent print images without human assistance, identifying the important features of each.

Once the features had been identified, each algorithm compared the latent prints against the galleries, finding possible matches and ranking the top 50 potential matches from most to least likely. Each system was then scored on accuracy (how often it suggested a correct match and how high the match was ranked). The tests enabled the researchers to investigate factors that could affect the performance of the technology. The criteria included gallery size, image resolution, finger position and the availability of a supplementary region of interest – the area of a smudged print most likely to provide useful minutiae.
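Scoring of this kind reduces to a rank-based identification rate: over all latent searches, what fraction return the true mate at or above a given rank in the candidate list? A minimal sketch of that computation (the function name and toy data are illustrative assumptions, not NIST's actual scoring code):

```python
# Hypothetical sketch of rank-k scoring for 1:N latent searches, where
# each search returns a candidate list ranked most- to least-likely.
def rank_k_identification_rate(results, true_mates, k=50):
    """results: {latent_id: ranked list of candidate ids}
    true_mates: {latent_id: the correct candidate id}
    Returns the fraction of latents whose mate appears in the top k."""
    hits = 0
    for latent_id, candidates in results.items():
        if true_mates[latent_id] in candidates[:k]:
            hits += 1
    return hits / len(results)


# Toy example: three latent searches against a small gallery.
results = {
    "latent_1": ["g7", "g2", "g9"],   # mate ranked first
    "latent_2": ["g4", "g5", "g1"],   # mate ranked third
    "latent_3": ["g3", "g8", "g6"],   # mate missing from the list
}
true_mates = {"latent_1": "g7", "latent_2": "g1", "latent_3": "g0"}

print(rank_k_identification_rate(results, true_mates, k=1))   # 1/3
print(rank_k_identification_rate(results, true_mates, k=50))  # 2/3
```

Sweeping k from 1 to 50 yields a cumulative accuracy curve, which is one standard way to compare how highly different systems rank correct matches.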

The investigators determined that half the prototypes tested found fingerprint matches about 80 percent of the time, while one was more than 95 percent accurate. There was a slight decrease in the identification rate when the gallery size was increased from 5000 to 10,000 records. The resolution of the images was not a statistically significant factor, and selecting a region of interest proved to be of limited use with some algorithms. The higher the quality of the image and the more features the algorithm could correctly detect, the better the identification rate.

Although these programs are still being tested and improved, their potential is clear. By reducing the tedious, routine manual work now required before computer matching can begin, these algorithms have the potential to improve both the quality and the speed of print identification.

