
Future Visions, Near and Far Sighted
Aug 2009
AUSTIN, Texas, Aug. 14, 2009 -- Kicked off with a bit of laser-driven music and the typical National Instruments flair, NIWeek 2009, the worldwide conference on measurement and automation held last week, had something for everybody – but the real vision news is probably a few months – or more – out.

On that front, there are several developments that bear watching. The first comes from NI itself. In one of the keynotes, the company showed off a soon-to-be-released module that interfaces a Camera Link camera to a board bearing an FPGA, or field-programmable gate array. Because of its processing power, the FPGA can handle the torrent of data from these cameras better than a standard computer can, as demonstrated when the board, a Basler camera and a mirror were used to track a moving laser dot more accurately.

The show floor at NIWeek 2009, the worldwide conference on measurement and automation. (Photo: National Instruments)

Vision products manager Kyle Voosen noted the speed of the processing, snapping his fingers while saying, "When an image comes in, you make a decision that fast." The FPGA approach is particularly well suited to high-speed inspection and other applications in which the same processing must be done in parallel on each pixel. It also can be used for image preprocessing, such as might be needed in optical coherence tomography. It is not as well suited to pattern recognition or matching. The FPGA board is available today, and the interface module should be available by year's end.
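The kind of per-pixel operation that lends itself to FPGA parallelism can be sketched in software. The following is an illustrative example, not code from NI or the article: a synthetic 8-bit frame is thresholded (the same test applied independently to every pixel) to locate a bright laser dot, the step a tracking mirror would need. The frame size, threshold and dot position are made-up values.

```python
import numpy as np

# Synthetic 8 x 8 grayscale frame with one bright "laser dot".
frame = np.zeros((8, 8), dtype=np.uint8)
frame[3, 5] = 250  # simulated dot (row 3, column 5)

THRESHOLD = 200                    # made-up brightness cutoff
mask = frame > THRESHOLD           # identical test on every pixel,
                                   # the part an FPGA runs in parallel
ys, xs = np.nonzero(mask)
centroid = (ys.mean(), xs.mean())  # dot position to steer the mirror

print(centroid)  # (3.0, 5.0)
```

On an FPGA the comparison would be wired into the pixel stream itself, so the decision is ready essentially as the frame arrives, rather than after a CPU loops over the array.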

Another future product – and an indication of a trend within vision to gather more data – could be found on the show floor. Industrial sensor maker Sick is teaming up with integrator Cyth Systems in the development of a 3-D camera kit, which will run using NI's graphical programming platform. The demo on the floor involved inspection of blister packs, using multiple cameras and light sources to separate them into the good, the bad and the ugly.

There could also be other vision improvements, although they're likely to be further in the future. In a speech, Purdue psychological sciences professor Greg Francis spoke about the human visual system, making the point that visual illusions are not peculiarities. Instead, they result from the normal operation of a system that serves us well most of the time.

In talking about human vision, Francis noted that the photoreceptors in the eye sit behind a network of blood vessels that blocks out part of every scene. The visual system eliminates those static shadows and presents us with a constructed representation of the world.

"Much of perception is a construction based on available information," summed up Francis.

For machine vision developers and users, such insights might provide a way to process scenes in much the way people do. That could help produce systems that see what people see.
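A loose software analogy for discounting a fixed occlusion, which is an assumption of mine rather than anything described at the conference, is to compensate for a known static pattern, much as the visual system discounts the unchanging shadows of retinal blood vessels. All values below are synthetic.

```python
import numpy as np

# A uniform "true" scene, occluded by a fixed dark stripe that stands
# in for a blood-vessel shadow on the retina.
true_scene = np.full((4, 4), 100.0)
shadow = np.zeros((4, 4))
shadow[1, :] = 40.0             # static occlusion, known to the system

observed = true_scene - shadow  # what the sensor actually records
estimate = observed + shadow    # add back the known static pattern

print(np.allclose(estimate, true_scene))  # True
```

The point mirrors Francis' remark: the recovered scene is a construction from the available information plus a model of what is known to be static, not a raw readout.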

NI is hoping to help nurture the future scientists and engineers who will build and use such vision systems. The company, the X Prize Foundation, Google, LEGO, and Wired's GeekDad are teaming up in the Moonbots contest. In it, parent-child teams will design, program and construct robots that perform simulated lunar missions similar to those required to win the $30 million Google Lunar X Prize. Announced at NIWeek 2009, the contest will run for a few months.

A keynote by NI cofounder Jeff Kodosky illustrated why the company works to foster science, technology, engineering and math (STEM) education. NI itself needs STEM graduates for continued product research and development, but the issue is bigger than that, said Kodosky.

After listing some of the problems confronting the world, he noted that the solutions often involve technology and that, in turn, means there must be somebody to create those solutions. In speaking of scientists and engineers, Kodosky said, "We need to develop more of them."

Hank Hogan

