12.18.2006
Robot Controlled by Thought Alone Premieres
In the future, you may be able to control a robot with just your brain. University of Washington researcher Rajesh Rao demonstrated his brain-powered robot at the Brain-Computer Interface Conference in Whistler, Canada, last week.
The human operator controls the robot by watching its field of view on a computer monitor, relayed from two camera 'eyes' mounted on the front of the robot. The operator wears a specially wired electrode cap and sends commands to the robot to perform specific activities simply by thinking them.
Currently, the system recognizes only simple, high-level commands; however, Rao believes that deeper integration with the operator's brain will make increasingly complex commands possible.
Right now, the "thought commands" are limited to a few basic instructions. A person can instruct the robot to move forward, choose one of two available objects, pick it up, and bring it to one of two locations. Preliminary results show 94 percent accuracy in choosing the correct object.
Objects available to be picked up are seen by the robot's camera and conveyed to the user's computer screen. Each object lights up randomly. When the person looks at the object that he or she wants to pick up and sees it suddenly brighten, the brain registers surprise. The computer detects this characteristic surprised pattern of brain activity and conveys the choice back to the robot, which then proceeds to pick up the selected object. A similar procedure is used to determine the user's choice of a destination once the object has been picked up.
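For readers curious how this "surprise"-based selection might look in code, here is a minimal Python sketch of the general idea. It assumes a single EEG channel, two candidate objects, and an idealized positive deflection around 300 ms after a flash of the attended object (the classic P300 "oddball" response); the sampling rate, window, and signal shapes are all illustrative assumptions, not the actual University of Washington implementation.

```python
import numpy as np

# Hypothetical illustration of P300-style "surprise response" selection --
# NOT the actual University of Washington pipeline. One EEG channel,
# two candidate objects, each flashed several times.

FS = 250                     # sampling rate in Hz (assumed)
EPOCH_SEC = 0.6              # length of the epoch analyzed after each flash
P300_WINDOW = (0.25, 0.45)   # window (s) in which the P300 peak is expected

def simulate_epoch(attended, rng):
    """Return one post-flash EEG epoch: noise, plus a P300 bump if attended."""
    t = np.arange(int(FS * EPOCH_SEC)) / FS
    epoch = rng.normal(0.0, 1.0, t.size)   # background "noisy" EEG
    if attended:
        # Idealized P300: a positive bump peaking ~300 ms after the flash
        epoch += 2.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
    return epoch

def select_object(epochs_per_object):
    """Pick the object whose flash-locked average shows the strongest
    response inside the expected P300 window."""
    lo, hi = (int(FS * s) for s in P300_WINDOW)
    scores = [np.mean(np.stack(epochs), axis=0)[lo:hi].mean()
              for epochs in epochs_per_object]
    return int(np.argmax(scores))

rng = np.random.default_rng(0)
attended_object = 1          # the object the user is looking at
n_flashes = 10               # each object flashes 10 times (assumed)

epochs = [[simulate_epoch(obj == attended_object, rng)
           for _ in range(n_flashes)]
          for obj in range(2)]

print("robot should pick object:", select_object(epochs))  # -> 1
```

Averaging over repeated flashes is what makes the scheme workable at all: any single epoch is dominated by noise, but the time-locked surprise response survives the averaging while the noise cancels out.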
"One of the important things about this demonstration is that we're using a 'noisy' brain signal to control the robot," Rao says. "The technique for picking up brain signals is non-invasive, but that means we can only obtain brain signals indirectly from sensors on the surface of the head, and not where they are generated deep in the brain. As a result, the user can only generate high-level commands such as indicating which object to pick up or which location to go to, and the robot needs to be autonomous enough to be able to execute such commands."
Rao's team plans to extend the research to more complex objects and to equip the robot with skills such as avoiding obstacles in a room. This will require more complicated commands from the operator's brain and more autonomy on the part of the robot.
Press release: University of Washington
Labels: brain, interface, neuroscience, robot