Indoor Autonomous Vehicle Navigates with Unprecedented Accuracy

Holly O’Dell, Contributing Editor

In the mid-1980s, Vivek Burhanpurkar wrote one of the first academic theses about the role of artificial intelligence (AI) in self-driving systems in complex, unmapped environments. Over the years, the founder and CEO of Cyberworks Robotics Inc. has extensively tested and evolved the technology into a mature and robust multipurpose navigation solution for third-party OEMs of manually driven or pushed equipment such as wheelchairs, floor cleaners, and indoor transport vehicles. Burhanpurkar had been using Microsoft Kinect 2, but to improve the system’s accuracy, Cyberworks needed a depth camera that was not only cost-effective and precise but also had a large field of view (FOV).

Cyberworks’ self-driving wheelchair can navigate everywhere from simple office space to complex airport layouts. Courtesy of Cyberworks.
Burhanpurkar selected the Intel RealSense camera, distributed by imaging and vision company FRAMOS Technologies Inc., to autonomously navigate complicated indoor settings. With a range of 200 mm to 3.5 m, the Intel RealSense D435 stereo depth camera uses a pair of imagers that look at the same object from slightly different perspectives. The difference in these perspectives is used to generate a depth map by calculating a numeric value for the distance from the imagers to every point in the scene.
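The depth-from-disparity principle described above reduces to a simple formula: a scene point that appears d pixels apart in the two imagers lies at depth Z = fB/d, where f is the focal length in pixels and B is the baseline between the imagers. A minimal sketch of that conversion, using hypothetical calibration values rather than the D435’s actual parameters:

```python
import numpy as np

# Illustrative stereo parameters (hypothetical, not the D435's calibration):
FOCAL_LENGTH_PX = 640.0   # focal length in pixels
BASELINE_M = 0.050        # separation between the two imagers, in meters

def disparity_to_depth(disparity_px: np.ndarray) -> np.ndarray:
    """Convert a per-pixel disparity map (in pixels) to a depth map (in meters).

    Depth Z = f * B / d: large disparities mean nearby points.
    Zero disparity (no match / infinitely far) maps to inf.
    """
    disparity = np.asarray(disparity_px, dtype=np.float64)
    with np.errstate(divide="ignore"):
        return np.where(disparity > 0,
                        FOCAL_LENGTH_PX * BASELINE_M / disparity,
                        np.inf)

# Example: with f*B = 32, a 64 px disparity gives 0.5 m, 16 px gives 2.0 m
depth = disparity_to_depth(np.array([64.0, 32.0, 16.0, 0.0]))
# -> [0.5, 1.0, 2.0, inf] meters
```

Note how the resolution degrades with distance: each pixel of disparity covers a wider depth interval far from the camera, which is why stereo devices like this quote a bounded working range.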

“Measuring the size of a finger, the camera is designed for easy and immediate integration,” said Ute Häußler, corporate editor at FRAMOS.

Combining the RealSense camera with lidar and a stereo 3D depth sensor enables the autonomous system to sense and avoid 3D obstacles. But Burhanpurkar and his engineering team first had to solve a couple of problems with the Intel RealSense camera, primarily the large amount of noise and oscillation in the range data. Once they overcame those challenges, the engineers were able to use the three sensors to provide a 260° FOV, including a 90° FOV from the camera.
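The article does not detail how Cyberworks suppressed the noise and oscillation; a common, generic approach to taming shot-to-shot oscillation in depth data is a temporal median filter over the last few frames. The class below is a hedged sketch of that general technique only; the name, window size, and sample readings are illustrative, not Cyberworks’ actual pipeline:

```python
from collections import deque
import numpy as np

class TemporalMedianFilter:
    """Smooth per-pixel depth readings via the median of the last N frames.

    A median rejects transient spikes ("oscillation") better than a mean,
    at the cost of a few frames of latency. Generic illustration only.
    """

    def __init__(self, window: int = 5):
        self.frames = deque(maxlen=window)

    def update(self, depth_frame: np.ndarray) -> np.ndarray:
        self.frames.append(np.asarray(depth_frame, dtype=np.float64))
        return np.median(np.stack(self.frames), axis=0)

# Example: a pixel oscillating around 2.0 m, with one 6.5 m noise spike
filt = TemporalMedianFilter(window=5)
for reading in (2.01, 1.99, 6.50, 2.00, 2.02):
    smoothed = filt.update(np.array([[reading]]))
# median of the five readings is 2.01 m: the spike is suppressed
```

A real pipeline would likely combine a temporal filter like this with spatial filtering and outlier rejection before handing ranges to the obstacle avoider.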

“To have true AI-based autonomy, you need to have 3D depth sensing and visual sensing capability,” Burhanpurkar said. “Other solutions rely solely on two-dimensional lidar, or they use three-dimensional lidar that costs tens of thousands of dollars. The Intel RealSense gave us an economical way to detect obstacles in the path and then go around them in a precise manner.”

To operate its system, Cyberworks uses a single standardized software platform that allows nontechnical personnel to be trained in 30 minutes to set up and teach their machines the required navigation patterns. The company has successfully tested the technology in several complex environments.

“Our system has the ability to quickly map vast areas within a couple of hours, as compared to days required for competitive technologies,” Burhanpurkar said. “We can also operate in arbitrary and rapidly changing dynamic conditions and easily incorporate knowledge of human behaviors and experience.”

The extremely compact Intel RealSense camera offers a 90° FOV. Courtesy of Intel.
At one international airport hub, where there is zero tolerance for collisions, Cyberworks’ autonomous system was installed on an indoor transport vehicle that traversed the entire 1.5-km facility with 100 percent accuracy. The vehicle mapped its surroundings, localizing itself on its own in a 3D space, and then used that information to know where it was and where it needed to go.
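A mapping system of this kind typically accumulates obstacle detections into an occupancy grid that the planner then consults to decide where the vehicle can go. The sketch below is a generic illustration of that data structure, not Cyberworks’ proprietary software; the grid size, resolution, and helper names are all assumptions:

```python
import numpy as np

# A minimal 2D occupancy grid (generic illustration, not Cyberworks' code).
GRID_SIZE = 20     # 20 x 20 cells
CELL_M = 0.25      # each cell covers 0.25 m, so the grid spans 5 m x 5 m

grid = np.zeros((GRID_SIZE, GRID_SIZE), dtype=bool)  # False = free, True = occupied

def mark_obstacle(x_m: float, y_m: float) -> None:
    """Mark the cell containing world point (x_m, y_m) as occupied."""
    col, row = int(x_m / CELL_M), int(y_m / CELL_M)
    if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
        grid[row, col] = True

def is_free(x_m: float, y_m: float) -> bool:
    """True if the cell containing (x_m, y_m) is not known to be occupied."""
    return not grid[int(y_m / CELL_M), int(x_m / CELL_M)]

# Example: record a detected wall segment at y = 2.0 m, x from 1.0 to 2.0 m
for x in np.arange(1.0, 2.0, CELL_M):
    mark_obstacle(x, 2.0)
# A planner querying is_free(1.5, 2.0) would now route around that segment.
```

Production systems layer localization on top of this, so that a new detection (such as a child stepping into the path) is written into the map at the right place and the planner can reroute and then rejoin the original path.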

“If something new pops up, like a child running in front of the vehicle, the system uses its AI to know how to circumvent that obstacle and resume the path it wants to be on,” Burhanpurkar said. If the vehicle’s occupant wants to make a stop at the restroom or grab a cup of coffee, the chair defines these locations of interest, diverts to them, and then resumes the path to the gate.

Cyberworks’ autonomous wheelchair features a camera, lidar, and 3D stereo depth sensor for precise navigation. Courtesy of Cyberworks.
For the past year, the self-driving technology has also been installed on an industrial floor cleaner covering a 50,000-sq-ft area at IBM headquarters in Toronto. The cleaner has been able to navigate everything from long hallways to tight office spaces without any collisions.

“In floor cleaning, people can get careless with the machines, and they end up scuffing the walls or the furniture,” Burhanpurkar said. “With autonomous technology, that can be entirely eliminated.”

In addition to protecting objects, Burhanpurkar counts safety, liability, consistency, and labor cost savings among the benefits of Cyberworks’ technology. “You can consistently ensure that the job was done in the best possible way because you are able to monitor the behavior and activity of the self-driving system, where you can’t always do that when a person is driving,” he said.

Cyberworks expects the first self-driving wheelchairs to enter the commercial market in mid to late 2019 for transporting hospital patients and airport passengers. Other potential uses include manually powered equipment and medication and surgical carts in hospitals.

“There are thousands of what we call corner cases, which are situations that arise very rarely but can be problematic,” Burhanpurkar said. “It is only through decades of real-world testing and real-world experience that you can address all of these instances for OEMs of manually driven or pushed equipment.”

Vision Spectra
Mar 2019
GLOSSARY
lidar
An acronym of light detection and ranging, describing systems that use a light beam in place of conventional microwave beams for atmospheric monitoring, tracking and detection functions. Ladar, an acronym of laser detection and ranging, uses laser light for detection of speed, altitude, direction and range; it is often called laser radar.
artificial intelligence
The ability of a machine to perform certain complex functions normally associated with human intelligence, such as judgment, pattern recognition, understanding, learning, planning and problem solving.
