Reach Out and Get Haptic Feedback

CAMBRIDGE, Mass., & LONDON, Oct. 29 -- Researchers at MIT and University College London have linked "hands across the water" in the first transatlantic touch, literally feeling each other’s manipulations of a small box on a computer screen.

Potential applications abound. "In addition to sound and vision, virtual reality programs could include touch as well," said Mandayam A. Srinivasan, director of MIT’s Touch Lab and leader of the MIT team.

Imagine the benefits of haptic (touch) feedback for a surgeon practicing telemedicine. Or how about artists from around the world collaborating on a virtual sculpture? They could create different forms, colors, sounds and textures -- all accessible from the Internet. Students in a physics class might "feel" the forces within the nucleus of an atom.

"That application could also be sent across a very widespread network," Srinivasan said. "We really don’t know all of the potential applications, just like Bell didn’t anticipate all of the applications for the telephone."

The tactile feat was first accomplished in May. The researchers are demonstrating it again at an Internet2 conference this week at the University of Southern California. That two-part demo transmits touch signals between California and MIT, and between California and University College London (UCL).

"As far as we know, this is the first time that touch signals have been transmitted over long distances, particularly across the Atlantic," said Srinivasan. In 1998, his group transmitted touch signals between two rooms at MIT, allowing two users to perform a cooperative manipulation task in a shared virtual environment.

"Touch is the most difficult aspect of virtual environments to simulate, but we have shown in our previous work with MIT that the effort is worthwhile. Now we are extending the benefits of touch feedback to long-distance interaction," said Mel Slater, professor of virtual environments in UCL’s computer sciencedepartment and Srinivasan’s UCL counterpart.

Srinivasan and Slater’s colleagues on the work are former MIT graduate student Boon K. Tay; current MIT graduate student Jung Kim of mechanical engineering; and J. Jordan, J. Mortensen and M. Oliveira at UCL. All are authors of a paper describing an experiment in which people who completed a collaborative long-distance computer task that included the sense of touch reported a significantly greater sense of having a partner than those who worked without the touch interface. The paper was presented Oct. 9 in Porto, Portugal, at PRESENCE 2002: The 5th Annual International Workshop on Presence.

How it Works
The demonstration of long-distance touch involves a computer and a small robotic arm that takes the place of a mouse. A user can manipulate the arm by clasping its end, which resembles a thick stylus. The overall system creates the sensation of touch by exerting a precisely controlled force on the user’s fingers. The arm, known as the PHANToM, was invented by others at MIT in the early 1990s (it's distributed commercially by SensAble Technologies). The current researchers modified the PHANToM software for the transatlantic application.
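The sketch below (in Python, and not the Touch Lab’s actual software) illustrates the basic idea behind that controlled force: a common penalty, or spring, model in which the stylus tip is pushed back out of a virtual surface with a force proportional to how deeply it has penetrated. The stiffness value, box geometry and function names are assumptions for illustration only.

# Illustrative sketch only -- not the PHANToM or Touch Lab code. It shows the
# common penalty (spring) method for haptic rendering: when the stylus tip
# penetrates a virtual surface, push back with a force proportional to the
# penetration depth. All names and values here are assumptions.

STIFFNESS = 800.0   # N/m, hypothetical stiffness giving a "hard rubber" feel
BOX_TOP_Z = 0.05    # m, hypothetical height of the virtual box's top face

def haptic_force(stylus_pos):
    """Return the z-axis reaction force (N) for a stylus tip at (x, y, z)."""
    x, y, z = stylus_pos
    penetration = BOX_TOP_Z - z          # how far the tip is inside the box
    if penetration <= 0.0:
        return 0.0                       # not touching: no force
    return STIFFNESS * penetration       # spring force pushing the tip back out

# The device driver would call this at a high rate (roughly 1 kHz) and command
# the robotic arm to exert the returned force on the user's fingers.
print(haptic_force((0.0, 0.0, 0.048)))   # tip 2 mm inside the box -> about 1.6 N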

On the computer screen, each user sees a 3-D room. Within that room are a black box and two tiny square pointers that show the users where they are in the room. They then use the robotic arms to collaboratively lift the box.

That’s where the touch comes in. As a user at MIT moves the arm -- and therefore the pointer -- to touch the box, he can "feel" the box, which has the texture of hard rubber. The user in London does the same thing. Together they attempt to pick up the box -- one applying force from the left, the other from the right -- and hold it as long as possible. All the while, each user can feel the other’s manipulations of the box.
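As a rough illustration of the cooperative lift (a simplified sketch, not the demonstration’s actual physics model), the box stays up only if friction generated by the two opposing grip forces can support its weight. The friction coefficient, box mass and function below are hypothetical.

# A minimal sketch, not the demo's physics engine: two users squeeze a virtual
# box from opposite sides, and the friction from their grip must balance
# gravity for the box to stay lifted. Numbers and names are assumptions.

MU = 0.6          # hypothetical friction coefficient of the box surface
MASS = 0.5        # kg, hypothetical box mass
G = 9.81          # m/s^2, gravitational acceleration

def box_held(force_left, force_right):
    """True if opposing grip forces (N) generate enough friction to hold the box."""
    squeeze = min(force_left, force_right)      # simplification: weaker grip limits the squeeze
    friction_capacity = 2.0 * MU * squeeze      # friction at both contact faces
    return friction_capacity >= MASS * G

print(box_held(3.0, 5.0))   # 3.6 N of friction < 4.9 N of weight -> box slips
print(box_held(6.0, 6.0))   # 7.2 N of friction >= 4.9 N of weight -> box is held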

An MIT writer participated in a recent demonstration. The force from the participant in London felt so real that the writer jumped backward.

What's Next?
Jung Kim, the MIT researcher who participated in the May demonstration, describes the experience as "amazing. The first touch from the other side of the world!"

There are still technical problems to be solved, however, before everyday applications become available. Chief among them is the delay, caused by Internet traffic, between when one user "touches" the on-screen box and when the second user feels the resulting force. "Each user must do the task very slowly or the synchronization is lost," Srinivasan said. When synchronization is lost, the box vibrates both visually and to the touch, making the task much more difficult.

Srinivasan is confident, however, that the delay can be reduced. "Even in our normal touch, there’s a time delay between when you touch something and when those signals arrive in your brain," he said. "So in a sense, the brain is teleoperating through the hand."

A one-way trip from hand to brain takes about 30 milliseconds; that same trip from MIT to London takes 150 to 200 milliseconds, depending on network traffic. "If the Internet time delays are reduced to values less than the time delay between the brain and hand, I would expect that the Internet task would feel very natural," Srinivasan said.
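A back-of-the-envelope sketch in Python, using the delays quoted above, shows why the longer network delay is noticeable: the mismatch between the two users’ views of the box grows roughly as hand speed multiplied by delay. The hand speed used here is an assumed value for illustration.

# A rough back-of-the-envelope sketch of why the delay matters, using the
# figures quoted above; the hand speed is an assumption for illustration.

HAND_SPEED = 0.1            # m/s, a hypothetical slow, careful hand motion

def position_error(delay_s, speed=HAND_SPEED):
    """Approximate mismatch between the two users' views of the box, in meters."""
    return speed * delay_s

for label, delay in [("hand-to-brain (30 ms)", 0.030),
                     ("MIT-to-London (150 ms)", 0.150),
                     ("MIT-to-London (200 ms)", 0.200)]:
    print(f"{label}: ~{position_error(delay) * 1000:.0f} mm of drift")

# At 150 to 200 ms the shared box can drift 15 to 20 mm between updates, enough
# to make it seem to vibrate; at 30 ms the drift is only about 3 mm.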

Although improving network speeds is the researchers’ main hurdle, they also hope to improve the robotic arm and its capabilities, as well as the algorithms that allow the user to "feel" via computer.

For more information, visit: www.ucl.ac.uk
