Two monkeys have been taught to use brain activity alone to move an avatar hand and identify the texture of virtual objects.
The research, from a team at the Duke University Center for Neuroengineering, could ultimately help paralysed people regain not only movement but also feeling.
“Someday in the near future, quadriplegic patients will take advantage of this technology not only to move their arms and hands and to walk again, but also to sense the texture of objects placed in their hands, or experience the nuances of the terrain on which they stroll with the help of a wearable robotic exoskeleton,” says professor Miguel Nicolelis.
Without moving any part of their real bodies, the monkeys were able to direct the virtual hands of an avatar to the surface of virtual objects and then differentiate their textures – by the power of thought alone.
The monkeys were first taught to use a joystick to move a virtual arm on a computer screen over three different images. Each image caused the joystick to vibrate in a slightly different way, conveying a different texture – and only one of the images triggered a reward.
Next, the monkeys carried out the same task without using their hands, instead controlling the virtual arm through a brain implant. The texture of each virtual object was conveyed as a pattern of minute electrical signals transmitted directly to the monkeys’ brains.
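A rough way to picture the brain-machine-brain loop the researchers describe: motor-cortex activity is decoded to move the avatar hand, and when the hand touches a virtual object, that object’s texture is returned to the brain as a distinct pattern of electrical pulses. The Python sketch below is purely illustrative; every name in it (the interface object and its read_spike_counts, decode_hand_velocity and stimulate_somatosensory_cortex methods, and the example pulse patterns) is a hypothetical placeholder, not the Duke team’s actual software.

```python
# Illustrative sketch of a brain-machine-brain trial, under the
# assumptions stated above. All interface methods are hypothetical.

import numpy as np

# Hypothetical texture codes: each virtual object maps to a distinct
# microstimulation pattern (pulse frequency in Hz, burst length in ms).
TEXTURE_PATTERNS = {
    "object_a": {"pulse_hz": 100, "burst_ms": 50},   # rewarded texture
    "object_b": {"pulse_hz": 200, "burst_ms": 25},
    "object_c": {"pulse_hz": 0,   "burst_ms": 0},    # no stimulation
}

def run_trial(interface, objects, hand_pos, dt=0.01, max_steps=500):
    """One closed-loop trial: decode movement from recorded neural
    activity, move the avatar hand, and deliver texture feedback
    whenever the hand is over a virtual object."""
    for _ in range(max_steps):
        spikes = interface.read_spike_counts()             # record motor cortex
        velocity = interface.decode_hand_velocity(spikes)  # e.g. a linear decoder
        hand_pos = hand_pos + velocity * dt                # move the avatar hand

        for name, obj in objects.items():
            if np.linalg.norm(hand_pos - obj["position"]) < obj["radius"]:
                # Hand is over this object: return its texture as a
                # pattern of electrical pulses to somatosensory cortex.
                interface.stimulate_somatosensory_cortex(TEXTURE_PATTERNS[name])
                if interface.hold_detected(name):
                    return name                            # object selected
    return None                                            # trial timed out
```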
One monkey needed only four attempts, and the other nine, to learn how to select the correct object in each trial.
“This is the first demonstration of a brain-machine-brain interface that establishes a direct, bidirectional link between a brain and a virtual body,” says Nicolelis.
“We hope that in the next few years this technology could help to restore a more autonomous life to many patients who are currently locked in, unable to move or experience any tactile sensation of the surrounding world,” he adds.
The findings raise hopes that it may be possible to create a robotic exoskeleton for severely paralyzed patients that would allow them to explore and receive feedback from the outside world.
The exoskeleton would be controlled directly by the patient’s voluntary brain activity, while sensors distributed across it would generate the tactile feedback needed to identify the texture, shape and temperature of objects, as well as the surface on which the patient is walking.
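A minimal sketch of that control-and-feedback loop, assuming a hypothetical API for the decoder and the exoskeleton (read_motor_intent, drive_actuators, read_surface_sensors, deliver_tactile_feedback); it only illustrates the closed loop described above, not the Walk Again Project’s actual implementation.

```python
def exoskeleton_loop(brain, exo):
    """Closed loop: decoded brain activity drives the exoskeleton, and
    the exoskeleton's distributed sensors return tactile feedback."""
    while True:
        # 1. Decode the patient's intended movement from brain activity.
        command = brain.read_motor_intent()        # hypothetical decoder call
        exo.drive_actuators(command)               # hypothetical actuator command

        # 2. Read the distributed sensors: contact, texture, temperature,
        #    and the surface underfoot, as described in the article.
        sensation = exo.read_surface_sensors()     # hypothetical sensor read

        # 3. Return that information to the patient as tactile feedback.
        brain.deliver_tactile_feedback(sensation)  # hypothetical feedback call
```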
The team is working with the Walk Again Project, an international, non-profit consortium, established by a team of Brazilian, American, Swiss and German scientists, and hopes to carry out a dramatic demonstration in 2014.
The plan is for two quadriplegic teenagers to accompany the Brazilian soccer team onto the pitch at the 2014 FIFA World Cup in Brazil and then kick the ball.