A force multiplier for VGRs

Vision-guided robots (VGRs) first emerged in the late 1980s. Since the turn of the century, however, adoption has accelerated, driven largely by lower costs and smaller system footprints, along with interest from key industries such as pharmaceuticals, food and beverage, automotive, and manufacturing. According to a recently published market forecast by Fact.MR, the robot vision market will reach $2.7 billion in the U.S. by the end of this year and $7.2 billion by the end of 2033.

Traditionally, robots have located parts or detected faulty ones using discrete, rules-based analysis of 2D or 3D data. This approach is often referred to as traditional machine vision, and while it can require significant programming and debugging by vision engineers and robot programmers, it is highly effective for many tasks.

But it does not work well when lighting varies or when a wide variety of parts must be assessed. This is where artificial intelligence (AI) plays an important role.

In this edition’s cover story, FANUC’s David Bruce offers insights into just how AI is changing the VGR landscape. He begins with the origins of the first artificial neural network, constructed in the mid-1950s to mimic the way the human brain works. Today, deep convolutional neural networks (DCNNs) are capable of sophisticated image analysis, weighing the thousands of different data inputs used to classify and sort objects.

Incorporating DCNNs into vision systems is now opening new use cases for vision-guided robotics, especially in logistics. One example is singulation, in which bulk mixed items are separated before being fed into an automated sorting system. Results show that modern DCNNs can be over 99% effective at this form of segmentation, a success rate that compares favorably to traditional machine vision's roughly 95%.

Don’t miss Bruce’s “AI Takes Vision-Guided Robotics to the Next Level.”

I hope you enjoy this issue, and I look forward to seeing many of you at Automate in Detroit, May 22-25, which is previewed in these pages.



Published: March 2023

We use cookies to improve user experience and analyze our website traffic as stated in our Privacy Policy. By using this website, you agree to the use of cookies unless you have disabled them.