‘Brain cap’ turns thought into motion

A ‘brain cap’ being developed at the University of Maryland lets users turn their thoughts into motion, potentially enabling them to control computers, robotic prosthetic limbs, motorized wheelchairs and even digital avatars.

The cap uses electroencephalography (EEG) to non-invasively read brain waves and translate them into movement commands.
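The article doesn’t detail the team’s signal-processing pipeline, but EEG movement decoders generally begin by band-pass filtering the scalp signal, because information about limb motion is concentrated at very low frequencies. The Python sketch below illustrates that first step under stated assumptions: the sampling rate, band edges and filter order are illustrative choices, not the team’s published parameters.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bandpass_eeg(eeg, fs=256.0, low=0.1, high=2.0, order=2):
    """Band-pass filter each EEG channel.

    eeg : array of shape (n_samples, n_channels), scalp voltages
    fs  : sampling rate in Hz (assumed; real headsets vary)
    low, high : band edges in Hz -- a low-frequency band of the
                kind often used when decoding movement from EEG
    """
    # Second-order sections are numerically stabler than a plain
    # transfer function for such a narrow low-frequency band
    sos = butter(order, [low, high], btype="bandpass", fs=fs, output="sos")
    # Forward-backward filtering gives zero phase lag, which matters
    # when aligning brain signals with simultaneously recorded motion
    return sosfiltfilt(sos, eeg, axis=0)

# Toy usage: 10 seconds of simulated 64-channel EEG
rng = np.random.default_rng(0)
eeg = rng.standard_normal((2560, 64))
filtered = bandpass_eeg(eeg)
```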

“We are on track to develop, test and make available to the public—within the next few years—a safe, reliable, noninvasive brain computer interface that can bring life-changing technology to millions of people whose ability to move has been diminished due to paralysis, stroke or other injury or illness,” says José Contreras-Vidal, an associate professor of kinesiology.

“We are doing something that few previously thought was possible.”

Contreras-Vidal’s team successfully used EEG brain signals to reconstruct the complex 3-D movements of the ankle, knee and hip joints during human treadmill walking. In earlier studies, they’d already produced similar results for 3-D hand movement and shown that subjects wearing the brain cap could control a computer cursor with their thoughts.
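Reconstructing continuous joint trajectories like this is commonly framed as a regression problem: each kinematic sample is predicted from a short window of recent multi-channel EEG. The sketch below shows that general idea with a time-lagged least-squares decoder; it is a minimal illustration, not the team’s published model, and the window length, feature layout and solver are all assumptions.

```python
import numpy as np

def build_lagged_features(eeg, n_lags=10):
    """Stack the current and previous EEG samples for every channel,
    so each row holds a short history window of brain activity."""
    n_samples, n_channels = eeg.shape
    rows = n_samples - n_lags
    X = np.empty((rows, (n_lags + 1) * n_channels))
    for lag in range(n_lags + 1):
        X[:, lag * n_channels:(lag + 1) * n_channels] = \
            eeg[n_lags - lag : n_samples - lag]
    return X

def fit_linear_decoder(eeg, kinematics, n_lags=10):
    """Least-squares map from lagged EEG to joint kinematics.

    kinematics : (n_samples, n_joints), e.g. ankle/knee/hip angles
    Returns weights W so that features @ W approximates the kinematics.
    """
    X = build_lagged_features(eeg, n_lags)
    y = kinematics[n_lags:]                    # align targets with features
    X = np.hstack([X, np.ones((len(X), 1))])   # bias term
    W, *_ = np.linalg.lstsq(X, y, rcond=None)
    return W

# Toy demo with simulated data: 64-channel EEG, 3 joint angles
rng = np.random.default_rng(1)
eeg = rng.standard_normal((2000, 64))
joints = rng.standard_normal((2000, 3))
W = fit_linear_decoder(eeg, joints)
```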

The technology holds particular value because it’s non-invasive, unlike other systems that involve implanting electrodes in the brain – after all, as the researchers point out, most people don’t want holes in their heads and wires attached to their brains.

“EEG monitoring of the brain, which has a long, safe history for other applications, has been largely ignored by those working on brain-machine interfaces, because it was thought that the human skull blocked too much of the detailed information on brain activity needed to read thoughts about movement and turn those readings into movement commands for multi-functional high-degree of freedom prosthetics,” says Contreras-Vidal.

The team is now working with researchers at other institutions to develop thought-controlled robotic prosthetics that can assist victims of injury and stroke.

“There’s nothing fictional about this,” says Rice University collaborator Marcia O’Malley. “The investigators on this grant have already demonstrated that much of this is possible. What remains is to bring all of it – non-invasive neural decoding, direct brain control and sensory feedback – together into one device.”