Disabled patients mind-meld with telepresence robots

[From Science]

[Image: Deus in machina. A semiautonomous robot can be controlled with the brain waves of paralyzed patients. Credit: José del R. Millán]

Disabled Patients Mind-Meld With Robots

by Sara Reardon on 6 September 2011

They’re not quite psychic yet, but machines are getting better at reading your mind. Researchers have invented a new, noninvasive method for recording patterns of brain activity and using them to steer a robot. Scientists hope the technology will give “locked in” patients—those too disabled to communicate with the outside world—the ability to interact with others and even give the illusion of being physically present, or “telepresent,” with friends and family.

Previous brain-machine interface systems have made it possible for people to control robots, cursors, or prosthetics with conscious thought, but they often take a lot of effort and concentration, says José del R. Millán, a biomedical engineer at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, who develops brain-machine interface systems that don’t need to be implanted into the brain. Millán’s goal is to make control as easy as driving a car on a highway. A partially autonomous robot would allow a user to stop concentrating on tasks that he or she would normally do subconsciously, such as following a person or avoiding running into walls. But if the robot encounters an unexpected event and needs to make a split-second decision, the user’s thoughts can override the robot’s artificial intelligence.
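The shared-control idea described above can be sketched in a few lines. This is a hypothetical illustration, not the EPFL controller: the robot executes its own obstacle-avoidance behavior by default, and a decoded user command, when one is present, overrides it.

```python
# Sketch of shared-control arbitration (illustrative names only; the actual
# EPFL system is more sophisticated). The robot handles routine navigation
# on its own; a decoded user intention, when detected, takes priority.

def arbitrate(user_command, autonomous_command):
    """Return the command to execute: the user's intent wins when present."""
    if user_command is not None:
        return user_command       # split-second human override
    return autonomous_command     # routine behavior left to the robot

# The robot's reflex is to veer around an obstacle, but the user thinks "stop":
print(arbitrate("stop", "veer_left"))   # -> stop
# With no decoded user command, the robot's own plan goes through:
print(arbitrate(None, "veer_left"))     # -> veer_left
```

The design choice is the point of the paragraph: the user only has to generate a command at decision points, not continuously, which is what makes control low-effort.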

To test their technology, Millán and colleagues created a telepresent robot by modifying a commercially available bot called Robotino. The robot looks a bit like a platform on three wheels, and it can avoid obstacles on its own using infrared sensors. On top of the robot, the researchers placed a laptop connected over a wireless Internet link and running Skype, a voice and video chat system. This allowed the human controller to see where the robot was going, and, because the laptop also showed a video of the user, it let others interact with the user as though he or she were actually there. The user wore a cap of tiny electroencephalogram (EEG) electrodes that measured his or her brain activity; the system translated the EEG signals into navigation commands and transmitted them to the robot in real time.
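A minimal sketch of the decoding step described above, with entirely illustrative feature names and thresholds (the EPFL decoder is not public): band-power features from imagined-movement EEG are mapped to one of a few discrete navigation commands, which are then streamed to the robot.

```python
# Toy EEG-to-command decoder (all names and thresholds are assumptions for
# illustration). Imagined left/right hand movement produces asymmetric
# motor-cortex band power, which maps to a discrete steering command.

def decode_command(left_motor_power, right_motor_power, margin=0.2):
    """Map imagined-movement band power to a discrete navigation command."""
    diff = left_motor_power - right_motor_power
    if diff > margin:
        return "turn_right"   # stronger activity over the left hemisphere
    if diff < -margin:
        return "turn_left"
    return "forward"          # no clear asymmetry: keep going

# One decoded command per EEG epoch, streamed to the remote robot:
commands = [decode_command(l, r) for l, r in [(0.9, 0.4), (0.3, 0.8), (0.5, 0.5)]]
print(commands)  # -> ['turn_right', 'turn_left', 'forward']
```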

EEG patterns for movement and navigation are similar from person to person, and Millán’s group has previously demonstrated that after a little practice, a healthy person can share control with the robot with very little effort. But would a bed-bound patient, who hasn’t used his limbs for years, have the same pattern of brain waves and be able to control robots as effectively?

The researchers recruited two patients whose lower bodies were paralyzed and who had been bed-bound for 6 or 7 years. They trained the patients to control the robot for 1 hour per week for 6 weeks. With the instructions transmitted over a wireless connection, the patients didn’t need to leave the hospital and were able to control the robot in Millán’s lab at EPFL, 100 kilometers away. At the end of the training period, the researchers instructed the subjects to drive the robot to various targets around the lab, such as furniture, people, and small objects, for 12 minutes.

The disabled patients performed just as well as healthy subjects, Millán and colleagues report this week at the IEEE Engineering in Medicine and Biology Society conference in Boston. When the researchers turned the shared control off, forcing the subjects to concentrate constantly on controlling the robot, the subjects took considerably longer to reach the targets than when they shared control.

Millán says he wasn’t terribly surprised that disabled people could control the robot, as previous research using brain scans has shown that even patients who have been paralyzed since birth can still imagine moving their limbs. But he was surprised by how fast they learned. He is now hoping to involve more bed-bound patients, including locked-in patients, in the study. He also sees future applications for the shared-control brain-machine interface, such as modifying it to let a user control a prosthetic limb or a wheelchair. And the researchers may eventually add an arm to the current telepresent robot to allow it to grasp objects.

Neuroengineer José Carmena of the University of California, Berkeley, says Millán’s approach “has a lot of novelty” in how it integrates both natural and artificial systems. There are some disadvantages, he says, to a system that uses a cap instead of a device implanted directly in the brain, such as the background signals that the cap might pick up. But for this application, he says, it is “an interesting avenue for telepresence.”

Millán says that the bed-bound patients were thrilled to participate in the study. “This opens a new possibility for families,” he says, who could interact with their bed-bound loved ones over a video connection without having to sit in front of a computer. But would the families of the disabled patients be creeped out by a robot following them around in the kitchen while they make dinner? “Well, that’s something we’ll have to ask,” he says.

