
Machine vision helps create a smart dartboard

Lutz Kreutzer, MVTec Software GmbH

An interdisciplinary group of students from the Technical University of Munich in Germany recently put machine vision to the test. Although experts had deemed the project unfeasible, even illusory, the students created a high-speed dartboard that moves so that the dart hits the bull's-eye every time.


Figure 1. These camera views are from above the dart and from the side. The red areas mark successive positions along the dart's flight path. Yellow points mark the centers of gravity, which deliver the data for computing the dart's trajectory.

It was the idea of ITQ GmbH, a consulting company based in Munich, to construct this unusual application in cooperation with the university.

The task required detecting the dart's trajectory in real time and precalculating its point of impact. A dart takes an average of 250 ms to reach the board. To detect the dart, find its trajectory, calculate the impact point and move the dartboard into position that quickly would seem impossible. For comparison, a person attempting the task would not even begin to react until four darts had passed him.

The students had only 18 weeks to accomplish the feat. Besides the technical challenge, they had to identify suitable components and systems and acquire them through sponsorship. They also had to develop the soft skills needed to work as an interdisciplinary team.

During the first project phase, two central questions had to be answered:

• How exactly — and how fast — can a dart be detected by standard machine vision systems?

• How should the motion trajectories be designed to move the dartboard precisely and quickly?

A preliminary calculation found that detecting the dart and positioning the dartboard would each take up to 100 ms. The remaining 50 ms had to be reserved for calculating the movement and for communication between the components.
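The budget works out as simple arithmetic, using the figures quoted above:

```python
# Illustrative timing budget for the dart's flight (all figures from the article).
FLIGHT_MS = 250   # average dart flight time to the board
DETECT_MS = 100   # worst case: detecting the dart with the cameras
MOVE_MS = 100     # worst case: positioning the dartboard

# What remains for trajectory computation and component communication:
reserve_ms = FLIGHT_MS - DETECT_MS - MOVE_MS
print(reserve_ms)  # 50
```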

Image processing

It was clear that exact edge extraction of the dart's outline was needed, so the team used a 3-D machine vision system in conjunction with two cameras.

The group had to consider that the unit would be repeatedly set up and dismantled, so the system needed a flexible assembly. To compensate for a different mounting position after each setup in a new environment, calibration of the machine vision system could not be complex.

Because of the limited time frame, the team used a standard machine vision library. After testing several other software products, it chose Halcon from MVTec, which enabled camera calibration with little effort and produced real-world coordinates, and which met the requirements for robustness and speed.

Figure 2. Moving the dartboard takes a maximum of 100 ms.

The dartboard had to be able to move the equivalent of one dartboard diameter, which required an acceleration of about 20 g. The enormous speed demanded parallel image processing: the software's automatic operator parallelization handled the multithreading, allowing the dartboard unit's software to run on two AMD Opteron processors in parallel.
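The 20 g figure can be sanity-checked with a rough kinematic model. Assuming a symmetric accelerate-then-decelerate (triangular velocity) profile over the full move, each half covers half the distance in half the time, giving a = 4d/t². The 0.45 m travel distance is an assumption (a typical dartboard diameter), not a number stated in the article:

```python
# Rough check of the ~20 g acceleration claim, assuming a triangular
# velocity profile (accelerate half the move, decelerate the other half).
d = 0.45    # travel distance in m (one dartboard diameter -- an assumption)
t = 0.100   # available move time in s (Figure 2: maximum 100 ms)
g = 9.81    # standard gravity, m/s^2

# Each half of the move covers d/2 in t/2:
#   d/2 = 0.5 * a * (t/2)**2   =>   a = 4 * d / t**2
a = 4 * d / t**2
print(round(a / g, 1))  # 18.3 -- consistent with the article's ~20 g
```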

Two digital Photonfocus CMOS cameras with Pentax lenses were chosen. One camera recorded the dart's trajectory from above and the other from the side, so vertical and horizontal coordinates could be computed separately (Figure 1). Both cameras delivered 150 frames per second at 1024 × 1024 pixels, acquired by two Silicon Software frame grabbers. Six high-frequency floodlights provided illumination.
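Those sensor figures imply a substantial data stream per camera. A quick estimate, assuming 8-bit monochrome pixels (the bit depth is not stated in the article):

```python
# Back-of-the-envelope data rate per camera, assuming 8-bit mono pixels.
width, height = 1024, 1024   # sensor resolution from the article
fps = 150                    # frame rate from the article
bytes_per_pixel = 1          # assumption: 8-bit monochrome

rate_mb_s = width * height * bytes_per_pixel * fps / 1e6
print(round(rate_mb_s))  # 157 -- roughly 157 MB/s per camera
```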

Figure 3. The smart dartboard device consists of a dartboard with mechanical actuators, a computer and cameras.

After trying traditional matching to locate the dart, the students settled on difference images. In this technique, the dart's center of area is extracted from every image. However, only the center of gravity is a fixed point on the dart that follows the trajectory to be determined. It could be assigned in each image from the inclination and length of the dart's main axis, from the center of area and from the center of gravity's known position relative to it. From these points, the parabola of flight could be computed with sufficient accuracy and the coordinates of the impact defined. After these computations, the dartboard was moved to the impact point (Figure 2): the mechanical actuators, driven by the control system, brought the center of the bull's-eye to the flying dart.
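The trajectory-fitting step can be sketched as follows. This is a minimal illustration with synthetic data, not the team's actual code: given the per-frame centers of gravity from the two views, fit a parabola in the vertical plane and a line in the horizontal plane, then evaluate both at the board's plane. All names and numbers here (including the 2.37 m board distance, borrowed from the regulation throw line) are assumptions:

```python
import numpy as np

def predict_impact(x, y, z, x_board):
    """Predict the impact point at the board plane x = x_board.

    x : per-frame distance travelled toward the board
    y : heights from the side camera (gravity bends these into a parabola)
    z : lateral positions from the top camera (modeled as a straight line)
    All inputs are tracked centers of gravity; the names are illustrative.
    """
    cy = np.polyfit(x, y, 2)   # quadratic fit: the parabola of flight
    cz = np.polyfit(x, z, 1)   # linear fit: no lateral force assumed
    return np.polyval(cy, x_board), np.polyval(cz, x_board)

# Synthetic example: eight tracked frames of a dart thrown from x = 0
# toward a board assumed to stand 2.37 m away.
x = np.linspace(0.0, 1.0, 8)
y = 1.8 + 0.1 * x - 0.5 * 9.81 * (x / 6.0) ** 2  # drop at ~6 m/s forward speed
z = 0.02 * x                                     # slight sideways drift
impact_y, impact_z = predict_impact(x, y, z, 2.37)
```

Because the synthetic data is exactly quadratic, the fit recovers the true impact point; real centroid measurements are noisy, which is why tracking many frames per flight at 150 fps matters.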

The result is a technique that allows the dartboard to move its own diameter in any direction and that works nearly 100 percent of the time.

Contact: Lutz Kreutzer, MVTec Software GmbH, Munich, Germany; e-mail:

Photonics Spectra
Mar 2007