Wristband Lets the Brain Control a Computer with a Thought and a Twitch

An electronic bracelet is being readied for mental control of computers, prosthetics and other devices—all without the need to drill a hole in your head

Thomas Reardon, chief executive of CTRL-Labs, holds an anatomical model of the human arm at the company’s midtown Manhattan headquarters.

Credit: R. Douglas Fields

Every so often a news article appears that shows a disabled person directing movement of a computer cursor or a prosthetic hand with thought alone. But why would anyone choose to have a hole drilled through his or her skull to embed a computer chip in the brain unless warranted by a severe medical condition?

A more practical solution may now be here that lets you hook up your brain to the outside world. CTRL–Labs, a start-up launched by the creator of Microsoft Internet Explorer, Thomas Reardon, and his partners, has demonstrated a novel approach for a brain–computer interface (BCI) that ties an apparatus strapped to your wrist to signals in the nervous system.

Physiologically, Reardon observes, all transfer of information among humans is carried out via fine motor control. Motor control of the tongue and throat gives us speech. Facial expression and body posture convey emotion and intention. Writing takes place by controlling fingers that scrape chalk on a blackboard, stroke paint, manipulate pen or pencil, or punch keys. If everything the brain does to interact with the world involves muscles, why not use the motor system to more directly interface mind and machine?


An answer to that question can be found nine stories up in Midtown Manhattan, where elevator doors open onto the offices of CTRL–Labs, a jumble of lab benches and computer screens occupied by a highly energized group that is engaged in manipulating robot arms and soldering circuit boards. Off-road bikes lean against the wall and on a table directly in front of the elevator lies a copy of the iconic textbook Principles of Neural Science.

Josh Duyan, the company’s chief strategy officer, emerges to greet me. Duyan doesn’t waste any time. He slips something onto his wrist that resembles a spiked choker that might be worn by a Goth punker, except that the golden square studs are on the inside of the band, pressed against his skin. A ribbon cable drapes from the wristband and extends to a laptop whose screen shows a white mannequinlike hand spreading her fingers. The hand waves, bends her wrist and makes a fist, all mimicking Duyan’s own hand movements, detected through the wristband.

The band does not sense neural impulses. Instead, it picks up the voltage bursts produced when muscle fibers in the arm contract, much as an electrocardiogram detects the electrical potentials of contracting cardiac muscle. The computer analyzes the electrical discharges from arm and hand muscles and uses them to calculate the motion and gripping force of the hand. The computer then initiates the same motions in the virtual hand. The movements of the limb on the computer screen could just as easily be carried out by one of the real black-and-chrome robotic arms lying about like pieces of an Iron Man suit.

Josh Duyan, CTRL-Labs' chief strategy officer, moves a virtual hand using a brain-computer interface that senses electrical activity in his forearm. Credit: R. Douglas Fields
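For readers who want a concrete picture of that signal path, the short Python sketch below (not CTRL-Labs' actual software) shows one simple way multichannel wrist EMG could be rectified, averaged into activation envelopes and mapped onto the joint angles of a virtual hand. The channel count, window length and calibration matrix are illustrative assumptions.

```python
# A minimal sketch, assuming a 16-electrode band and a 20-joint hand model,
# of the kind of processing described above: rectify and average raw EMG into
# activation envelopes, then map those envelopes to virtual joint angles.
import numpy as np

N_CHANNELS = 16        # assumed number of electrodes on the band
N_JOINTS = 20          # assumed joint angles in the virtual hand model
WINDOW = 200           # samples per analysis window (e.g., 100 ms at 2 kHz)

# Hypothetical calibration matrix, stand-in for one learned from example movements.
emg_to_joints = np.random.randn(N_JOINTS, N_CHANNELS) * 0.1

def activation_envelope(raw_window: np.ndarray) -> np.ndarray:
    """Rectify and average each channel to estimate muscle activation."""
    return np.abs(raw_window).mean(axis=1)          # shape: (N_CHANNELS,)

def decode_joint_angles(raw_window: np.ndarray) -> np.ndarray:
    """Map the activation envelope to joint angles for the on-screen hand."""
    envelope = activation_envelope(raw_window)
    return emg_to_joints @ envelope                 # shape: (N_JOINTS,)

# Simulated 100 ms of 16-channel EMG, then one decoding step.
fake_emg = np.random.randn(N_CHANNELS, WINDOW) * 1e-4
print(decode_joint_angles(fake_emg))
```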

I next watch a video demonstration. In it Reardon taps his fingers on a blank tabletop, and a string of letters appears on the computer screen at a pace of 20 words a minute. A wearer of the wristband can, in principle, dance fingers in the air or even twitch them inside a pocket to create a message. “Your kids will not type,” Reardon proclaims about the fate of conventional keyboards. Similarly, using this device, a robotic arm could be manipulated to do things your own arm could not—snaking an endoscope through the aorta to do surgery on a faulty heart valve beating inside a patient’s chest, perhaps.

Currently, the most advanced prosthetic limbs, such as the FDA-approved DEKA arm, use switches activated by muscle twitches, in some cases supplemented by sensors that detect the voltage bursts produced by a contracting muscle (EMG, or electromyography). But CTRL–Labs claims it has advanced this capability to detect clusters of individual muscle cells contracting. That achievement expands the bandwidth of the brain’s output and, with the company’s most advanced technology, allows electronic devices to be manipulated with thought alone.

A quick primer on the neurophysiology of muscle control is needed to understand how this BCI system can be operated with thoughts alone. The key concept is the “motor unit.” The electrical signals picked up on the skin represent a cacophony of thousands of individual muscle fibers contracting at the same time in the forearm. In recent years neurophysiologists have developed mathematical methods to isolate the firing of individual muscle fibers from among the thousands that are activated when a muscle contracts. The biceps has 580,000 muscle fibers in it, all straining to do one thing—pull the forearm bones toward your body. This arm motion is modulated continuously on a millisecond scale by regulating hundreds of small clusters of muscle fibers individually to determine how fast the arm moves and with what force. This level of fine motor control is why the stunted motions of a mechanical device are a caricature of the graceful, fluid movements of any animal.
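The sketch below illustrates that decomposition idea in miniature. It simulates a few spiky motor-unit trains mixed across several EMG channels, unmixes them with a generic blind source separation step (FastICA, used here only as a stand-in for the more specialized convolutive algorithms real motor-unit decomposition relies on) and reads off candidate firing times by thresholding. All sizes and parameters are assumptions for illustration.

```python
# A rough illustration of EMG decomposition: separate the summed multichannel
# signal into candidate motor-unit sources, then detect firings per source.
# FastICA is a stand-in here, not the algorithm used by any particular lab.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Simulate 8 channels of EMG as mixtures of 4 spiky "motor unit" trains.
n_units, n_channels, n_samples = 4, 8, 5000
spike_trains = (rng.random((n_units, n_samples)) < 0.01).astype(float)
mixing = rng.normal(size=(n_channels, n_units))
emg = mixing @ spike_trains + 0.05 * rng.normal(size=(n_channels, n_samples))

# Unmix into candidate sources, one per motor unit.
ica = FastICA(n_components=n_units, random_state=0)
sources = ica.fit_transform(emg.T).T               # shape: (n_units, n_samples)

# Detect firing times on each recovered source with a simple threshold.
for i, src in enumerate(sources):
    threshold = 4 * np.std(src)
    firings = np.flatnonzero(np.abs(src) > threshold)
    print(f"candidate motor unit {i}: ~{len(firings)} detected firings")
```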

There is no need, though, to have each of the half million muscle fibers in your biceps controlled by a separate motor neuron. In fact, there are 774 motor axons—the long fibers stretching out from neurons in the spinal cord—that control all the muscle fibers in our biceps. The axon from one spinal cord motor neuron branches and makes a small cluster of muscle fibers contract together. This arrangement—a cluster of muscle fibers controlled by a single motor neuron—is called a “motor unit,” and the BCI device can detect a single motor unit firing. On average, then, each of those 774 motor units comprises roughly 750 muscle fibers. This capability means that rather than the 774 motor units in our biceps performing only one action—pulling back the forearm, for instance—each unit can in principle carry out a separate task.

Consider how a pianist communicates the music in his mind to a listener’s mind by tapping keys on a piano. With this BCI wristband, a pianist could do the same with just a picture of a keyboard, imagining himself playing “Chopsticks” or Chopin. It is not necessary for muscle to move bone and forcefully depress a wooden lever, a piano key. An imperceptible twitching of a microscopic motor unit suffices to create an electrical discharge that is easily detected by this brain–computer interface. Now, rather than being limited to the five digits that evolution has given us, a pianist could play by issuing mental instructions to 12 virtual fingers, one for each note on the chromatic scale (all the white and black keys in an octave). The frenzied chords and blizzard of arpeggios in Chopin’s Polonaise become a piece of cake to play. Of course, the pianist would have to learn to activate the muscle fibers in the hand in a new way, but learning to control 12 digits would not be that different from learning to control five. Think of spiders or millipedes, neither of which trips over itself, despite being equipped only with bug brains.
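In software, the 12-virtual-finger idea could be as simple as a lookup: each detected motor unit is assigned to one note of the chromatic scale, and every firing of that unit triggers its note. The toy sketch below makes the mapping concrete; the unit indices, note assignment and trigger function are hypothetical.

```python
# A toy sketch of the "12 virtual fingers" idea: motor unit i plays note i of
# the chromatic scale whenever it fires. Everything here is hypothetical and
# serves only to make the mapping concrete.
CHROMATIC = ["C", "C#", "D", "D#", "E", "F",
             "F#", "G", "G#", "A", "A#", "B"]

# Hypothetical assignment: motor unit i controls note i of the octave.
unit_to_note = {unit: note for unit, note in enumerate(CHROMATIC)}

def on_motor_unit_firing(unit_index: int) -> None:
    """Called whenever the decoder reports a firing of one motor unit."""
    note = unit_to_note.get(unit_index)
    if note is not None:
        print(f"play {note}")      # stand-in for sending a MIDI note-on

# Simulated burst of firings spelling out a C-major chord.
for unit in (0, 4, 7):
    on_motor_unit_firing(unit)
```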

Previously, researchers had been able to detect up to 25 individual motor units. Reardon claims to have greatly surpassed that, recording about 100 individual units. “But we expect to blow through that number shortly,” he says. “That’s a lot of fingers,” he notes.

Jose Contreras-Vidal, director of the Noninvasive Brain–Machine Interface System Laboratory at the University of Houston, says that if CTRL–Labs has achieved control of hundreds of individual motor units in real time, it would mark a significant achievement. “The challenge will be how to map the claimed hundreds of individual motor units to control a machine,” he adds. “I think it would be very challenging not to require movement.” Indeed, it was not previously known whether the brain can control its motor units independently at all, but that is what CTRL–Labs says it has discovered.

Neurofeedback is used to train the brain and computer to interface with each other. The process is as effortless as playing a video game. Duyan just imagines trying to point the computer mannequin’s index finger, for example, without actually moving his own hand. What is happening inside his motor cortex is beyond his perception. But when the mannequin does point her finger, his brain learns to associate that motor unit’s firing with the hand motion. “You are training the computer, too,” Reardon says. Eventually skill in moving the hand develops unconsciously and automatically.
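A bare-bones sketch of such a co-adaptive loop, assuming a simple linear decoder (this is not CTRL-Labs' algorithm), might look like the following: the user attempts a gesture, the decoder guesses from the EMG features, the guess is shown as feedback, and the decoder's weights are nudged toward the intended gesture.

```python
# A minimal sketch of a co-adaptive training loop: decoder guesses, user sees
# the guess as feedback, decoder weights are nudged toward the intended label.
# Feature size, gesture set and learning rule are illustrative assumptions.
import numpy as np

N_FEATURES, N_GESTURES = 16, 3          # assumed feature and gesture counts
weights = np.zeros((N_GESTURES, N_FEATURES))
LEARNING_RATE = 0.1

def decode(features: np.ndarray) -> int:
    """Decoder's current guess at which gesture the user intends."""
    return int(np.argmax(weights @ features))

def update(features: np.ndarray, intended: int) -> None:
    """Nudge the decoder toward the gesture the user was trying to make."""
    guess = decode(features)
    if guess != intended:
        weights[intended] += LEARNING_RATE * features
        weights[guess] -= LEARNING_RATE * features

# Simulated training trials: each gesture produces a characteristic pattern.
rng = np.random.default_rng(1)
prototypes = rng.normal(size=(N_GESTURES, N_FEATURES))
for trial in range(200):
    intended = trial % N_GESTURES
    features = prototypes[intended] + 0.3 * rng.normal(size=N_FEATURES)
    update(features, intended)          # user would see the guess as feedback

print("decoder accuracy after practice:",
      np.mean([decode(prototypes[g] + 0.3 * rng.normal(size=N_FEATURES)) == g
               for g in range(N_GESTURES) for _ in range(50)]))
```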

The device appears to be at an early stage of development, so I was not permitted to give the wristband a test drive. Learning to use the device requires practice—a process of neurofeedback in which both the computer and the user’s brain make slight adjustments to each other. And with practice, a lot happens. Patrick Kaifosh, a company co-founder who has become a skilled user, sits down, straps on the device and begins playing a game of Asteroids on his cell phone—flying his spaceship around the screen, shooting down invaders with his lasers and dodging incoming missiles from aliens. All the while his palm rests motionless on the table, with only the slightest twitching of skin here and there as he carries on a conversation with me.

“What does it feel like?” I ask.

Perplexed by the question, he answers with his own query: “What does it feel like for you to think about moving your fingers?”

Indeed, I am interfacing effortlessly now with my computer through my fingertips, automatically pressing 26 keys of the alphabet as I transmit my thoughts to you. What if instead of only some two dozen keys, I had hundreds of them, all controlled automatically by my brain?

R. Douglas Fields is a senior investigator at the National Institutes of Health’s Section on Nervous System Development and Plasticity. He is author of Electric Brain: How the New Science of Brainwaves Reads Minds, Tells Us How We Learn, and Helps Us Change for the Better (BenBella Books, 2020).
