Optical Lace Could Give Robots Higher Tactile Capabilities

Optical lace — lacings of stretchable optical fibers distributed throughout 3D-printed elastomer frameworks — could be used to create a linked sensory network, similar to a biological nervous system, that would enable soft robots to sense how they interact with their environment and adjust their motions accordingly. The synthetic material was developed at Cornell University by a research team led by Patricia Xu in the lab of professor Rob Shepherd.

For the optical lace, the researchers used a flexible, porous lattice structure manufactured from 3D-printed polyurethane. They threaded its core with stretchable optical fibers containing more than a dozen mechanosensors, then attached an LED to illuminate the fiber. When the lattice structure was pressed at various points, the sensors pinpointed changes in the photon flow.

“When the structure deforms, you have contact between the input line and the output lines, and the light jumps into these output loops in the structure, so you can tell where the contact is happening,” Xu said. “The intensity of this determines the intensity of the deformation itself.”  


A flexible, porous lattice structure is threaded with stretchable optical fibers containing more than a dozen mechanosensors and attached to an LED light. When the lattice structure is pressed, the sensors pinpoint changes in the photon flow. Courtesy of Cornell University Organic Robotics Lab.
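
To make the readout concrete, the following is a minimal toy sketch in Python, not the authors' actual model, of how a press might be localized from the light coupled into the output loops. The loop positions, the intensity-weighted centroid estimate, and the use of peak intensity as a proxy for deformation depth are all illustrative assumptions.

# Toy sketch (not the authors' model): infer contact location and depth
# from the light coupled into each output loop. Assumes each loop sits at
# a known position and that coupled intensity grows with local deformation.
import numpy as np

# Hypothetical positions (mm) of the output-loop sensors along the lattice.
loop_positions_mm = np.array([5.0, 10.0, 15.0, 20.0, 25.0])

def locate_contact(intensities, positions_mm):
    """Estimate contact position and strength from per-loop intensities,
    measured relative to the undeformed baseline (arbitrary units)."""
    intensities = np.asarray(intensities, dtype=float)
    total = intensities.sum()
    if total <= 0:
        return None, 0.0  # no deformation detected
    # The intensity-weighted centroid interpolates between loops, so the
    # position estimate can be finer than the loop spacing itself.
    position = float(intensities @ positions_mm / total)
    strength = float(intensities.max())  # crude proxy for deformation depth
    return position, strength

# Example: a press near the third loop couples most of the light there.
measured = [0.02, 0.10, 0.55, 0.12, 0.01]
pos, strength = locate_contact(measured, loop_positions_mm)
print(f"estimated contact at {pos:.1f} mm, intensity {strength:.2f}")

Because the centroid interpolates between neighboring loops, an estimate of this kind can resolve positions finer than the spacing of the sensors themselves.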

Currently, autonomous robots mainly rely on visual and tactile sensors to complete complex tasks. A more distributed sensor network, similar to the neural circuitry of animals, would allow robots to interact at higher tactile resolutions and to measure mechanical damage accumulated over time. The importance of distributed, volumetric sensing is even more pronounced in the field of soft robotics, where every part of the machine deforms, the researchers said.

“We want to have a way to measure stresses and strains for highly deformable objects, and we want to do it using the hardware itself, not vision,” Shepherd said. “A good way to think about it is from a biological perspective. A person can still feel their environment with their eyes closed, because they have sensors in their fingers that deform when their finger deforms. Robots can’t do that right now.”

The optical lace would not be used as a skin coating for robots, Shepherd said, but would be more like the flesh itself. Robots fitted with the optical lace would be better suited for tasks such as caregiving that require tactile sensitivity.

“The robot would need to know its own shape in order to touch and hold and assist elderly people without damaging them,” Shepherd said. “The same is true if you’re using a robot to assist in manufacturing. If they can feel what they’re touching, then that will improve their accuracy.”

While the optical lace is not as sensitive as a human fingertip, it is more sensitive to touch than the human back. The researchers believe it could be used to localize deformation with submillimeter positional accuracy and sub-Newton force resolution.

The material is washable, too, which has led to another application: Shepherd’s lab has launched a startup company to commercialize Xu’s sensors to make garments that can measure a person’s shape and movements for augmented reality training.

The researchers also plan to explore the use of machine learning to detect more complex deformations, like bending and twisting. “I make models to calculate where the structure is being touched, and how much is being touched,” Xu said. “But in the future, when you have 30 times more sensors and they’re spread randomly throughout, that will be a lot harder to do. Machine learning will do it much faster.”
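
The sketch below (Python, using scikit-learn) illustrates that direction under stated assumptions: the sensor layout, the Gaussian response model, and the noise level are invented for demonstration, not taken from the paper. A small neural network is trained on simulated readings to regress the contact position directly, sidestepping an analytical model.

# Hedged sketch of the machine-learning direction Xu describes: with many
# sensors scattered at random, learn the mapping from raw intensity
# readings to contact location rather than modeling it analytically.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_sensors = 30                      # "30 times more sensors," per the quote
sensor_xy = rng.uniform(0, 50, size=(n_sensors, 2))  # random layout (mm)

def simulate_reading(contact_xy, width_mm=8.0, noise=0.02):
    # Assumed response: coupled intensity falls off with distance to the press.
    d = np.linalg.norm(sensor_xy - contact_xy, axis=1)
    return np.exp(-(d / width_mm) ** 2) + rng.normal(0, noise, n_sensors)

# Build a synthetic training set of (readings -> contact position) pairs.
contacts = rng.uniform(0, 50, size=(2000, 2))
readings = np.array([simulate_reading(c) for c in contacts])

X_train, X_test, y_train, y_test = train_test_split(
    readings, contacts, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500,
                     random_state=0).fit(X_train, y_train)

err = np.linalg.norm(model.predict(X_test) - y_test, axis=1)
print(f"mean localization error: {err.mean():.2f} mm")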

The research was published in Science Robotics (https://doi.org/10.1126/scirobotics.aaw6304). 
