
Tracking eyes reveals surprises about autism

May 2006
Hank Hogan

In studying autism, researchers have struggled with a basic problem: interviewing subjects. Because the condition is characterized by difficulty in social interactions, investigators have turned to eye tracking. However, eye-tracking headgear is so obtrusive that it is tolerated only by those with mild autism, making it nearly impossible to resolve some basic questions.

Now, thanks to advances in cameras and some custom engineering, researchers at the State University of New York at Binghamton’s Institute for Child Development have overcome these issues and found some surprising results: When given tasks, autistic children pay attention, are motivated and understand what they are trying to do. “This process for our kids with autism is not impaired,” said Raymond G. Romanczyk, director of the institute.

These preliminary and unpublished results, which are to be presented at several conferences, don’t support current thinking. Romanczyk and his group are planning further studies — experiments that have long been contemplated, but that have been made possible only recently. “We’ve wanted to do this research for about 15 years and, finally, the technology caught up to what we needed,” he explained.

The setup at Binghamton consisted of commercial and custom technology. The commercial component was an eye tracker from Tobii Technology AB of Stockholm, Sweden. This stand-alone device doesn’t require contact with the subject and can track the gaze at 50 fps with an accuracy of about half a degree. When located at a distance of 60 cm, the eye tracker can follow the subject’s eyes even when the head moves over distances of tens of centimeters. Besides following the subject’s gaze, it can measure pupil size. It employs two near-infrared 875-nm LEDs as a source and collects the reflections of that light off the corneas. That information and other visual data feed algorithms that determine where each eye is looking and the overall gaze point.
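The gaze computation described above, inferring where the eye points from the offset between the pupil center and the corneal reflection (glint) of the near-IR LEDs, can be sketched in simplified form. The linear per-axis calibration below is an illustrative stand-in, not Tobii's actual algorithm; all function names, offsets and screen coordinates are hypothetical.

```python
# Illustrative sketch of pupil-center/corneal-reflection (PCCR) gaze
# estimation, the general technique behind remote eye trackers like the
# one described. NOT Tobii's algorithm; calibration data are made up.

def calibrate(samples):
    """Fit screen = a * offset + b per axis by least squares.

    samples: list of ((dx, dy), (sx, sy)) pairs, where (dx, dy) is the
    pupil-minus-glint offset in camera pixels and (sx, sy) is the known
    on-screen point the subject fixated during calibration.
    """
    coeffs = []
    for axis in (0, 1):
        xs = [off[axis] for off, _ in samples]
        ys = [scr[axis] for _, scr in samples]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        var = sum((x - mx) ** 2 for x in xs)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        a = cov / var
        coeffs.append((a, my - a * mx))
    return coeffs

def gaze_point(offset, coeffs):
    """Map a pupil-glint offset to an estimated on-screen point."""
    return tuple(a * offset[i] + b for i, (a, b) in enumerate(coeffs))

# Hypothetical 4-point calibration on an 800 x 600 display.
samples = [((-10.0, -6.0), (0.0, 0.0)),
           ((10.0, -6.0), (800.0, 0.0)),
           ((-10.0, 6.0), (0.0, 600.0)),
           ((10.0, 6.0), (800.0, 600.0))]
coeffs = calibrate(samples)
print(gaze_point((0.0, 0.0), coeffs))  # -> (400.0, 300.0), screen center
```

A real tracker fits a richer model, often polynomial and per eye, with head-pose compensation, and then combines the two eyes into the overall gaze point.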

Binghamton engineer Wayne Kashinsky, who designed the setup, said that, about five years ago, higher-resolution cameras using the FireWire standard came out from a variety of vendors. These allowed companies such as Tobii Technology to work with a wider view of the face and to be farther away. The devices, said Kashinsky, have a camera that images the entire face with enough resolution to differentiate the pupils and to get the tracking information.

In the setup, the table was split into two levels, with the eye tracker in plain sight between them. Because the eye tracker worked only if the eyes were somewhere in a cube about 12.5 cm on a side, Kashinsky mounted it on a motorized stand so that the angle could be changed. He combined this with a motorized eight-way adjustable seat. The researchers could then bring children who might not be all that cooperative into the right alignment and position. What’s more, they could make minor adjustments to correct for some movement.

To gain further information, they also used thin, wireless galvanic skin conductance measurement devices that Kashinsky designed. Secured by a band in a child’s hand, the devices provided an indication of the state of arousal — or anxiety — being experienced during the study.

Using this equipment, the researchers had groups of normal and autistic children of various ages perform tasks. Some were imitations of simple play, such as picking up a toy airplane and making it fly. These were done on the table and repeated with the objects near a model’s face. Finally, the children were asked to imitate a model’s actions, such as opening the mouth or blinking the eyes. A second part of the study replaced the live model with a life-size image of the same person on a 42-in. plasma screen.

Synchronization played an important role in gathering the data. The researchers used auditory instructions that only the model could hear to initiate an action, simultaneously triggering the sensors. This ensured that the eye tracker and galvanic skin conductance sensors captured information at the right moment.
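The trigger scheme can be sketched as a shared timestamp against which each sensor stream is later cut. The stream names, sampling rates and trigger time below are hypothetical, not the actual Binghamton configuration.

```python
# Sketch of trigger-based synchronization: each recorder logs
# timestamped samples, and a shared trigger time lets all streams be
# aligned to the same moment. Rates and names are illustrative only.

def samples_after_trigger(stream, trigger_ms):
    """Keep only the samples recorded at or after the trigger time.

    stream: list of (timestamp_ms, value) tuples, sorted by time.
    """
    return [(t, v) for t, v in stream if t >= trigger_ms]

# Hypothetical streams: eye tracker at 50 Hz, skin conductance at 10 Hz.
eye = [(i * 20, f"gaze{i}") for i in range(10)]   # 0 .. 180 ms
gsr = [(i * 100, f"gsr{i}") for i in range(4)]    # 0 .. 300 ms

trigger_ms = 100  # auditory cue delivered to the model at t = 100 ms
eye_epoch = samples_after_trigger(eye, trigger_ms)
gsr_epoch = samples_after_trigger(gsr, trigger_ms)
print(len(eye_epoch), len(gsr_epoch))  # -> 5 3
```

With both epochs cut at the same trigger, a gaze sample can be paired with the skin conductance reading nearest to it in time, which is what lets anxiety be read against what the child was looking at.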

Thanks to the eye movement and skin conductance data, the researchers knew what the children were looking at and could gauge their anxiety as they attempted various tasks. What they found, Romanczyk said, was that the school of thought that autism is grounded in a lack of attention didn’t appear to be correct. “As it turned out, the kids with autism were not looking less at the face than the typical kids, but their performance was much, much impaired,” he explained.

Researchers developed this setup to help with autism studies. It consists of a split table and an eye tracker (the black box in the middle) mounted to follow a subject’s eyes and, therefore, attention. A custom-designed galvanic skin conductance device (seen here in the open hand) measures levels of anxiety or arousal. Courtesy of Wayne Kashinsky, Raymond Romanczyk and the State University of New York at Binghamton.

This performance difference between normal and autistic subjects didn’t show up when subjects were imitating the movement of a toy. However, it grew progressively greater the closer the task got to the model’s face. But there wasn’t a spike in arousal or anxiety, indicating that the difference was not because the autistic children were too nervous to successfully complete the task. The problem, noted Romanczyk, appears to lie in the processing of the information. The disconnect doesn’t seem to be attributable to a lack of motivation or of not understanding the instructions.

Plans call for an extension of the study from largely passive imitation to what is likely to be more demanding social interaction. Romanczyk predicted that this phase will highlight differences between autistic and normal children. “Our hunch is now we may in fact see a role of arousal, a role of attention,” he said.

In studying social interactions, Kashinsky contended that continued technological advances, particularly in computer processing power, may help. Although cameras with five or more megapixels are available, current computers aren’t powerful enough to convert the data that these collect into eye-tracking information at 30 fps.
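The bottleneck Kashinsky describes comes down to raw pixel throughput. A rough back-of-envelope figure, assuming hypothetical 8-bit monochrome frames:

```python
# Back-of-envelope pixel throughput for the scenario discussed:
# a 5-megapixel sensor at 30 fps, assuming 1 byte per pixel (8-bit mono).
megapixels = 5
fps = 30
bytes_per_pixel = 1

pixels_per_second = megapixels * 1_000_000 * fps
mb_per_second = pixels_per_second * bytes_per_pixel / 1_000_000
print(mb_per_second)  # -> 150.0 (MB/s of imagery to search for pupils and glints)
```

Every one of those bytes must be searched for pupils and glints within each 33-ms frame interval, which is why real-time tracking at that resolution outran the computers of the day.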

That computer limitation will change over the next few years, and eye-tracking systems could then have a field of view several feet on a side. That would make it possible to image a group and tell where each person was looking.

Contact: Wayne Kashinsky, Health Sciences Center, State University of New York at Binghamton; Nico Vroom, Tobii Technology Inc., Stockholm, Sweden.

