Caren B. Les, firstname.lastname@example.org
Earbuds fitted with infrared sensors and connected to a microcomputer that can control electronic devices could offer hands-free remote control for changing the music on an MP3 player, turning room lights on and off, starting up a washing machine or performing some other everyday activity.
One such gadget, invented by Kazuhiro Taniguchi, a researcher at the University of Tokyo’s School of Engineering, could have more practical and serious applications as well, including helping the physically disabled remotely operate electronic apparatuses – cameras, computers, air conditioners – or monitoring the elderly for signs of declining health or medical emergencies.
“I was inspired by iPod, YouTube, Nintendo Wii and Windows ME to call my invention the ‘Me-Me Switch,’” Taniguchi said. His device is designed, in a sense, to mimic the facial movements of the wearer. The sensors measure tiny involuntary movements inside the ear that result from intentional facial movements such as a wink, a smile or a raised eyebrow.
The machine can be programmed to respond to a variety of facial expressions, he said. With a twitch of your mouth, for example, you could start the music on your iPod. Because the device can monitor the natural movements of the face in everyday life and accumulate the data, it could come to "know" a person well enough to operate automatically, he added — for instance, switching to more cheerful music if there is an indication of sadness.
The Me-Me Switch generates signals for controlling the electronic devices, or subject machines, corresponding to intentional movements of the face – biocommands – by first sensing the resulting ear movements with an optical distance sensor and then processing the sensing signals with a single-chip microcomputer. Generated signals are converted by amplifier circuits and other devices into signals suitable for the electrical characteristics of the subject machine.
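The signal chain described above — sense ear movement, recognize a biocommand, convert it into a signal the subject machine accepts — can be sketched roughly as follows. This is an illustrative outline only; the firmware, command names and mappings are not published, so everything here is invented for the example.

```python
# Hypothetical sketch of the Me-Me Switch signal chain. The command table
# and function names are invented; they are not the actual implementation.

# Map a recognized facial biocommand to an action on the subject machine.
COMMAND_TABLE = {
    "wink": "toggle_lights",
    "smile": "next_track",
    "raised_eyebrow": "volume_up",
}

def classify_biocommand(ear_movement_pattern):
    """Stand-in for the microcomputer's recognition step: look the
    sensed ear-movement pattern up in a table of known biocommands."""
    return COMMAND_TABLE.get(ear_movement_pattern)

def generate_control_signal(command):
    """Stand-in for the amplifier/conversion stage: turn a recognized
    command into a signal suited to the subject machine."""
    return f"SIGNAL:{command}" if command else None

print(generate_control_signal(classify_biocommand("wink")))  # SIGNAL:toggle_lights
print(generate_control_signal(classify_biocommand("yawn")))  # None (not a biocommand)
```

An unrecognized movement simply produces no signal, mirroring the article's point that only defined biocommands drive the subject machine.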
The microcomputer in his gadget incorporates an analog-to-digital converter with a sampling frequency of 1 MHz and 10 bits. The 5-VDC power to actuate the switch is supplied from the subject machine. The optical distance sensor outputs 0 to 5 V, corresponding to distances from ear skin to the sensor. Because the switch is in the ear, the optical sensor is unaffected by sunlight.
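Those figures fix the resolution of the sensing stage: a 10-bit converter spanning the sensor's 0-to-5-V output resolves steps of roughly 5 mV. A quick calculation, using only the parameters quoted above:

```python
# ADC parameters quoted in the article: 10-bit resolution, 0-5 V range,
# 1 MHz sampling frequency.
N_BITS = 10
V_REF = 5.0
SAMPLE_RATE_HZ = 1_000_000

counts_full_scale = 2 ** N_BITS - 1    # 1023 counts at full scale
lsb_volts = V_REF / counts_full_scale  # smallest resolvable voltage step

def counts_to_volts(counts):
    """Map a raw 10-bit ADC reading to the sensor's 0-5 V output."""
    return counts * lsb_volts

print(round(lsb_volts * 1000, 2))        # ~4.89 mV per count
print(round(counts_to_volts(1023), 3))   # 5.0 V at full scale
```

So each count corresponds to just under 5 mV of sensor output, i.e., a very small change in ear-skin-to-sensor distance.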
The switch carries out machine control by reacting only to the intentional actions of the user. It does not react to daily actions other than these biocommands, Taniguchi said. During experimentation, the subject machine operated only when biocommands were applied, he noted.
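One plausible way to reject everyday movements is to act only on deflections larger than ordinary facial motion produces. The threshold value and the gating logic below are illustrative guesses, not Taniguchi's published method:

```python
def is_biocommand(trace_volts, threshold_v=0.5):
    """Treat only large, deliberate ear-canal deflections as biocommands.
    Small everyday movements stay below the threshold and are ignored.
    The 0.5 V threshold is an illustrative guess, not a published value."""
    swing = max(trace_volts) - min(trace_volts)
    return swing > threshold_v

print(is_biocommand([1.9, 2.0, 3.4, 3.3, 2.0]))  # True  (wink-like deflection)
print(is_biocommand([2.0, 2.05, 1.98, 2.02]))    # False (idle chatter, chewing, etc.)
```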
“Currently, the Me-Me Switch has some problems that need to be solved,” Taniguchi said, adding that “the success, or recognition, ratio of the biocommands is as low as 70 percent.” This ratio is the proportion of a user’s biocommands that succeeded in controlling the machine.
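The recognition ratio is simply successful activations divided by attempts; the counts below are illustrative, not experimental data:

```python
def recognition_ratio(successes, attempts):
    """Fraction of issued biocommands that actually controlled the machine."""
    return successes / attempts

# e.g., 70 successful activations out of 100 attempted biocommands
print(recognition_ratio(70, 100))  # 0.7
```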
He and his fellow researchers plan to develop an algorithm with a higher biocommand recognition ratio to meet the needs of a broad range of people, drawing on analyses of the environments in which the switch is used and on ear-movement data sampled from various potential user groups, such as younger and older people.