Unsupervised, Mobile and Wireless Brain–Computer Interfaces on the Horizon

Researchers are working to engineer practical devices that patients can use in their homes

Juliano Pinto, a 29-year-old paraplegic, kicked off the 2014 World Cup in São Paulo with a robotic exoskeleton suit that he wore and controlled with his mind. The event was broadcast internationally and served as a symbol of the exciting possibilities of brain-controlled machines. Over the last few decades research into brain–computer interfaces (BCIs), which allow direct communication between the brain and an external device such as a computer or prosthetic, has skyrocketed. Although these new developments are exciting, there are still major hurdles to overcome before people can easily use these devices as a part of daily life.

Until now such devices have largely been proof-of-concept demonstrations of what BCIs are capable of. Currently, almost all of them require technicians to manage them and include external wires that tether individuals to large computers. New research, conducted by members of the BrainGate group, a consortium that includes neuroscientists, engineers and clinicians, has made strides toward overcoming some of these obstacles. “Our team is focused on developing what we hope will be an intuitive, always-available brain–computer interface that can be used 24 hours a day, seven days a week, that works with the same amount of subconscious thought that somebody who is able-bodied might use to pick up a coffee cup or move a mouse,” says Leigh Hochberg, a neuroengineer at Brown University who was involved in the research. The researchers also want these devices to be small, wireless and usable without the help of a caregiver.

Unsupervised communication
For people who lose motor and sensory function as a result of amyotrophic lateral sclerosis, spinal cord injury or stroke-induced brain damage, simply communicating can be extremely difficult, if not impossible. To help these individuals researchers have developed systems that connect a patient’s brain to a computer, allowing them to type on a virtual keyboard by using their thoughts to point and click on a screen. These are akin to the computer that physicist Stephen Hawking uses, although his works by detecting cheek movements rather than via a direct connection to the brain. But devices like Hawking’s, which rely on residual muscle movement, are labor-intensive and cannot serve those who have lost all motor abilities.


The newer, brain-controlled versions of these devices work in one of two ways: either through an electroencephalogram (EEG) cap that detects neural activity using electrodes placed on the scalp or through a device implanted directly into the brain. A decoder translates these neural signals into commands that move computer cursors and prosthetic limbs. BrainGate has developed a device, named after itself, composed of an “aspirin-sized array of electrodes” that is implanted in the motor cortex, the area of the brain primarily responsible for voluntary movement. Patients use their thoughts to move a cursor and type on a virtual keyboard on a screen. Researchers are continually working to improve the speed of these systems. In a study published this September in Nature Medicine the group used the BrainGate system to achieve the highest published performance of “virtual typing” by a person to date: approximately six words per minute, still much slower than the average typing speed. (Scientific American is part of Nature Publishing Group.)
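To make the decoding step concrete, here is a minimal sketch in Python of how a simple linear decoder of this general kind can work. The channel count, bin width and stand-in data below are assumptions for illustration, not details of the BrainGate implementation.

    import numpy as np

    # Minimal sketch of a linear neural decoder (illustrative assumptions, not
    # the BrainGate implementation): binned spike counts from an implanted
    # electrode array are mapped to a 2-D cursor velocity by a fitted weight matrix.

    N_CHANNELS = 96   # electrodes in the array (assumed)
    BIN_MS = 20       # spike-count bin width in milliseconds (assumed)

    rng = np.random.default_rng(0)

    # During calibration the user imagines moving toward known targets; pairs of
    # (spike counts, intended velocity) are collected and the decoder is fit.
    calib_counts = rng.poisson(lam=5.0, size=(500, N_CHANNELS)).astype(float)
    calib_velocity = rng.normal(size=(500, 2))  # stand-in intended velocities

    # Ordinary least-squares fit: velocity is approximated by counts @ W
    W, *_ = np.linalg.lstsq(calib_counts, calib_velocity, rcond=None)

    def decode(spike_counts):
        """Translate one bin of spike counts into a 2-D cursor velocity command."""
        return spike_counts @ W

    # Each new bin of neural activity nudges the cursor; a "click" is typically
    # decoded separately, for example from a distinct neural pattern or dwell time.
    cursor = np.zeros(2)
    for _ in range(50):
        counts = rng.poisson(lam=5.0, size=N_CHANNELS)
        cursor += decode(counts) * (BIN_MS / 1000.0)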

A major limitation of these devices is that the decoder requires frequent calibration to accurately estimate an individual’s movement intention, because neural signals change over time. This drift can be caused by a slight movement of the electrode or by outside noise, such as a phone ringing or an ambulance driving by. Hochberg and his team reported in a study published this week in Science Translational Medicine that they have overcome this obstacle by creating a device that calibrates itself automatically. “A lot of the concerns [around BCIs] had to do with the stability of the recordings, because if the decoder is calibrated to one time point, there is a natural limit of how long the decoder could be good for,” says Beata Jarosiewicz, a neuroscientist at Brown and lead author of the paper. With the improved system, patients were able to type for several hours at a time across multiple days without the need for intervening technicians, a big step in improving usability. “Lack of [stability in the neural recordings] has been a persistent problem with BCIs, and the investigators used a set of well-considered approaches to effectively address this for cursor control,” says Andrew Schwartz, a University of Pittsburgh neuroscientist who has done extensive work on brain-controlled devices but was not involved in the study. “Furthermore, the same basic ideas can probably be applied to more elaborate control, for instance, of an arm and hand,” he adds.
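The principle behind such self-calibration can be sketched roughly as follows; this is a simplified illustration, not the authors’ exact algorithm. Each key the user ultimately selects retrospectively reveals where the cursor was meant to go, so the neural data recorded along the way can be relabeled with those inferred intentions and the decoder refit while the person keeps typing.

    import numpy as np

    def retrospective_labels(cursor_positions, selected_key_position):
        """Infer intended velocities as unit vectors pointing from each recorded
        cursor position toward the key the user eventually selected."""
        diffs = selected_key_position - cursor_positions
        norms = np.linalg.norm(diffs, axis=1, keepdims=True)
        return diffs / np.maximum(norms, 1e-9)

    def refit_decoder(spike_counts, inferred_velocities):
        """Refit the linear neural-to-velocity mapping on the relabeled data."""
        W, *_ = np.linalg.lstsq(spike_counts, inferred_velocities, rcond=None)
        return W

    # After each selection, the buffered spike counts and cursor trajectory from
    # that selection are relabeled and folded into the training set, letting the
    # decoder track slow changes in the neural signals without a technician.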

Wireless and mobile
Still, these machines require large computers to run, and the wires protruding through the skin not only create a risk of infection but are also impractical for patients who want to move freely. To address these issues the researchers are collaborating with another group at Brown, led by neuroengineer Arto Nurmikko, to make the devices wireless and mobile.

In most brain-controlled devices there are two cables: a short one joins the implanted device to a connector that sits atop the skull, and a long one connects the top of the head to external electronics that handle several functions, including decoding the signals and sending movement commands.

In recent years, Nurmikko’s group has been developing and testing an implantable, wireless BCI in monkeys. The microelectronic device eliminates the longer cable from the top of the head; instead it is implanted under the skin and includes a tiny wireless radio. According to Nurmikko, the device can now transmit signals on the order of 100 megabits per second, which, he says, would be considered decent speed for a home Internet connection. This, however, is still a fraction of what the brain, which stores billions of bytes of data, is capable of transmitting. The interface is fully implanted under the scalp, and because nothing pierces the skin it greatly reduces the possibility of infection. These devices have worked in monkeys, and Hochberg and Nurmikko recently received a grant to prepare them for human testing within the next two years.
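A back-of-envelope calculation suggests why bandwidth on that order matters for streaming raw neural data; the channel count, sampling rate and bit depth below are typical assumed values, not specifications of Nurmikko’s device.

    channels = 100            # electrodes in the array (assumed)
    sample_rate_hz = 20_000   # samples per second per channel (assumed)
    bits_per_sample = 12      # analog-to-digital resolution (assumed)

    megabits_per_second = channels * sample_rate_hz * bits_per_sample / 1e6
    print(f"~{megabits_per_second:.0f} megabits per second of raw neural data")  # ~24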

Their group is also working to shrink the computers that run the system down to the size of an iPhone, something portable and possibly even wearable. The unit would wirelessly receive the signals from the implant and do all the number crunching before sending commands to an external device, such as a keyboard or robotic arm. “The vision that my colleagues and I have is eventually allowing someone who is disabled—tetraplegic for example—to not have to be in a totally restricted, supervised environment,” Nurmikko says.

Rise of the machines
The current reality, however, is that most patients will opt for nonimplantable BCIs because of the risks involved in going under the knife. Although EEG-based systems do not require neurosurgery because they record from the scalp rather than from inside the brain, they are much less specific and robust than an implanted device. “It’s nice to see EEG and related approaches involving scalp electrodes, but if I think about getting my thoughts to operate a complex, dexterous action like playing the piano, my personal opinion is that these systems will never get you to that level of precision,” Nurmikko says.

Beyond these much-needed engineering advances, more research is required to expand the possibilities of brain-controlled machines to facilitate complex tasks and behaviors such as tool use and language production (without the use of a virtual keyboard). “To do this, we will need to change the way we have been doing neuroscience and aim at the discovery of basic principles of brain operation,” Schwartz says. “We have a long way to go.”