IKEA Furniture Assembly via Robot Vision
Scientists at Nanyang Technological University, Singapore (NTU Singapore) have developed a robot that can autonomously assemble an IKEA chair from its individual components, without interruption, in less than 10 min. The robot uses an Ensenso N35 3D camera from IDS and two gripper-equipped robotic arms to pick up and join the parts.
The robot hardware is designed to mimic how a person assembles objects. A 3D camera takes the place of the eyes, and two six-axis industrial robot arms take the place of the arms. Each arm is fitted with a parallel gripper for picking up objects, and force sensors attached to the wrists measure how strongly the fingers grip and how firmly they press parts against each other.
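The article does not describe NTU's control software, but one common way to use wrist force feedback like this is to close the gripper in small increments until a target force is reached. The function names, step size, and force threshold below are illustrative assumptions, not NTU's implementation:

```python
# Hypothetical sketch: close a parallel gripper in small steps and stop
# when the measured wrist force reaches a target. All names and values
# here are assumptions for illustration only.

def close_gripper(read_force, step_mm=0.5, target_n=15.0, max_travel_mm=40.0):
    """Close in step_mm increments; stop when force reaches target_n.
    Returns total travel in mm, or None if the target was never reached."""
    travel = 0.0
    while travel < max_travel_mm:
        if read_force(travel) >= target_n:
            return travel
        travel += step_mm
    return None

# Simulated sensor: no contact until 12 mm of travel, then force
# ramps up at 5 N per additional millimeter of closure.
def fake_force(travel):
    return max(0.0, (travel - 12.0) * 5.0)

stop_at = close_gripper(fake_force)  # stops once simulated force hits 15 N
```

A real controller would also monitor force while mating parts, not just while gripping, but the stop-on-threshold loop is the core idea.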
Scientists at NTU Singapore develop robot for furniture assembly. Courtesy of IDS Imaging Development Systems GmbH.
The robot starts the assembly process by taking 3D images of the parts lying on the ground to create a map of the estimated positions of the various components. The camera works according to the projected texture stereo vision principle, which imitates human vision.
Two cameras acquire images of the same scene from two different positions. Although both cameras see the same scene content, each object appears at a different image position along each camera's projection rays. Matching algorithms compare the two images, search for corresponding points, and record every point's displacement in a disparity map. The Ensenso software then determines the 3D coordinates for each individual image pixel or object point, in this case the chair components.
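For a rectified stereo pair, each entry in the disparity map converts to depth through the standard relation Z = f · B / d, where f is the focal length in pixels, B the baseline between the cameras, and d the disparity in pixels. A minimal sketch, with illustrative calibration values rather than the Ensenso N35's actual parameters:

```python
# Convert a matched point's disparity (pixels) to depth (meters) using
# the rectified-stereo relation Z = f * B / d. The focal length and
# baseline below are made-up example values, not real calibration data.

def depth_from_disparity(disparity_px, focal_px=1400.0, baseline_m=0.10):
    """Depth in meters; larger disparity means the point is closer."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: no valid match")
    return focal_px * baseline_m / disparity_px

near = depth_from_disparity(200.0)  # a large shift: a nearby part
far = depth_from_disparity(70.0)    # a small shift: a distant part
```

Applying this conversion per pixel turns the disparity map into the 3D point map the robot uses to locate the chair components.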
“For a robot, putting together an IKEA chair with such precision is more complex than it looks,” said professor Pham Quang Cuong of NTU. “The job of assembly, which may come naturally to humans, has to be broken down into different steps, such as identifying where the different chair parts are, the force required to grip the parts, and making sure the robotic arms move without colliding into each other.”
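One of the checks Cuong mentions, keeping the two arms from colliding, can be sketched in its simplest form: sample both arms' planned tool paths at the same timesteps and verify the grippers never come within a safety margin. The paths and margin below are invented for illustration; a real planner would check every link of each arm, not just the tool points:

```python
# Illustrative collision check between two synchronized 3D waypoint
# paths. Waypoints and the 0.15 m safety margin are made-up examples.
import math

def min_separation(path_a, path_b):
    """Smallest distance between paired waypoints of two synchronized paths."""
    return min(math.dist(p, q) for p, q in zip(path_a, path_b))

def paths_are_safe(path_a, path_b, margin_m=0.15):
    return min_separation(path_a, path_b) >= margin_m

left = [(0.0, 0.0, 0.5), (0.1, 0.2, 0.4), (0.2, 0.4, 0.3)]
right = [(0.6, 0.0, 0.5), (0.5, 0.2, 0.4), (0.3, 0.4, 0.3)]

safe = paths_are_safe(left, right)  # the final waypoints come too close
```

When a check like this fails, the planner must re-time or re-route one arm's trajectory before execution.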
The NTU researchers developed algorithms that enable the robot to take the necessary steps to assemble the chair on its own, and in just 8 min and 55 sec.
Robot vision can be challenging, and in this particular instance, the challenge was to locate the components as precisely, quickly, and reliably as possible in a cluttered environment. The camera system meets this challenge by projecting a pattern mask that produces a high-contrast texture on the object surface, even under difficult lighting conditions. The projected texture supplements the weak or nonexistent surface structure of the IKEA chair's components.
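The reason texture matters can be shown with a toy sum-of-absolute-differences (SAD) match along a scanline: on a featureless surface every candidate offset scores the same, so the correspondence is ambiguous, while a textured patch produces one clear best match. The data below is synthetic and the matcher deliberately minimal:

```python
# Toy SAD matcher: slide a small left-image patch along a right-image
# scanline and return every offset that ties for the best (lowest) score.
# Pixel values are synthetic examples, not real image data.

def best_matches(left_patch, right_row):
    w = len(left_patch)
    scores = [sum(abs(a - b) for a, b in zip(left_patch, right_row[o:o + w]))
              for o in range(len(right_row) - w + 1)]
    best = min(scores)
    return [o for o, s in enumerate(scores) if s == best]

flat = best_matches([5, 5, 5], [5] * 10)                      # every offset ties
textured = best_matches([3, 9, 1], [5, 5, 3, 9, 1, 5, 5, 5])  # one clear winner
```

Projecting a pattern onto a bare chair part effectively turns the "flat" case into the "textured" one, which is why the disparity map stays dense and reliable.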
According to Cuong, artificial intelligence will make the application even more independent and promising in the future.
“We are looking to integrate more artificial intelligence into this approach to make the robot more autonomous so it can learn the different steps of assembling a chair through human demonstration or by reading the instruction manual, or even from an image of the assembled product,” he said.
The robot developed by the scientists at NTU Singapore is used for research into dexterous manipulation, an area of robotics that requires precise control of the forces and movements of specialized robot hands or fingers.