In a Stanford-led research report, three participants with movement impairment controlled an onscreen cursor simply by imagining their own hand movements.
A clinical research publication led by Stanford University investigators has demonstrated that a brain-to-computer hookup can enable people with paralysis to type via direct brain control at the highest speeds and accuracy levels reported to date.
The report involved three study participants with severe limb weakness — two from amyotrophic lateral sclerosis, also called Lou Gehrig’s disease, and one from a spinal cord injury. They each had one or two baby-aspirin-sized electrode arrays placed in their brains to record signals from the motor cortex, a region controlling muscle movement. These signals were transmitted to a computer via a cable and translated by algorithms into point-and-click commands guiding a cursor to characters on an onscreen keyboard.
Each participant, after minimal training, mastered the technique sufficiently to outperform the results of any previous test of brain-computer interfaces, or BCIs, for enhancing communication by people with similarly impaired movement. Notably, the study participants achieved these typing rates without the use of automatic word-completion assistance common in electronic keyboarding applications nowadays, which likely would have boosted their performance.
One participant, Dennis Degray of Menlo Park, California, was able to type 39 correct characters per minute, equivalent to about eight words per minute.
‘A major milestone’
This point-and-click approach could be applied to a variety of computing devices, including smartphones and tablets, without substantial modifications, the Stanford researchers said.
“Our study’s success marks a major milestone on the road to improving quality of life for people with paralysis,” said Jaimie Henderson, MD, professor of neurosurgery, who performed two of the three device-implantation procedures at Stanford Hospital. The third took place at Massachusetts General Hospital.
Henderson and Krishna Shenoy, PhD, professor of electrical engineering, are co-senior authors of the study, which was published online Feb. 21 in eLife. The lead authors are former postdoctoral scholar Chethan Pandarinath, PhD, and postdoctoral scholar Paul Nuyujukian, MD, PhD, both of whom spent well over two years working full time on the project at Stanford.
“This study reports the highest speed and accuracy, by a factor of three, over what’s been shown before,” said Shenoy, a Howard Hughes Medical Institute investigator who’s been pursuing BCI development for 15 years and working with Henderson since 2009. “We’re approaching the speed at which you can type text on your cellphone.”
“The performance is really exciting,” said Pandarinath, who now has a joint appointment at Emory University and the Georgia Institute of Technology as an assistant professor of biomedical engineering. “We’re achieving communication rates that many people with arm and hand paralysis would find useful. That’s a critical step for making devices that could be suitable for real-world use.”
Shenoy’s lab pioneered the algorithms used to decode the complex volleys of electrical signals fired by nerve cells in the motor cortex, the brain’s command center for movement, and convert them in real time into actions ordinarily executed by spinal cord and muscles.
“These high-performing BCI algorithms’ use in human clinical trials demonstrates the potential for this class of technology to restore communication to people with paralysis,” said Nuyujukian.
Millions of people with paralysis reside in the United States. Sometimes their paralysis comes gradually, as occurs in ALS. Sometimes it arrives suddenly, as in Degray’s case.
Now 64, Degray became quadriplegic on Oct. 10, 2007, when he fell and sustained a life-changing spinal-cord injury. “I was taking out the trash in the rain,” he said. Holding the garbage in one hand and the recycling in the other, he slipped on the grass and landed on his chin. The impact spared his brain but severely injured his spine, cutting off all communication between his brain and musculature from the head down.
“I’ve got nothing going on below the collarbones,” he said.
Degray received two device implants at Henderson’s hands in August 2016. In several ensuing research sessions, he and the other two study participants, who underwent similar surgeries, were encouraged to attempt or visualize patterns of desired arm, hand and finger movements. Resulting neural signals from the motor cortex were electronically extracted by the embedded recording devices, transmitted to a computer and translated by Shenoy’s algorithms into commands directing a cursor on an onscreen keyboard to participant-specified characters.
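The release does not describe the decoding algorithm itself; a common intracortical BCI approach maps binned firing rates to two-dimensional cursor velocity with a learned linear filter. A minimal, purely hypothetical sketch (channel count, weights and spike statistics are illustrative, not from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear decoder: cursor velocity as a weighted sum of
# per-channel firing rates (real arrays have ~96+ channels; 8 shown).
n_channels = 8
W = rng.standard_normal((2, n_channels)) * 0.1   # learned weights
b = np.zeros(2)                                  # learned offset

def decode_velocity(firing_rates):
    """Map one time bin of per-channel firing rates to (vx, vy)."""
    return W @ firing_rates + b

# Integrate decoded velocity into a cursor position, bin by bin.
cursor = np.zeros(2)
dt = 0.02  # 20 ms bins
for _ in range(50):
    rates = rng.poisson(10.0, n_channels)  # simulated spike counts
    cursor += decode_velocity(rates) * dt
```

In a real system the weights would be fit from calibration sessions in which the participant attempts or imagines movements, as described above.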
The researchers gauged the speeds at which the patients were able to correctly copy phrases and sentences — for example, “The quick brown fox jumped over the lazy dog.” Average rates were 7.8 words per minute for Degray and 6.3 and 2.7 words per minute, respectively, for the other two participants.
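These rates line up under the standard typing convention of five characters per word; Degray's 39 correct characters per minute corresponds to the reported 7.8 words per minute:

```python
def chars_per_min_to_wpm(cpm, chars_per_word=5):
    """Convert correct characters per minute to words per minute using
    the standard typing convention of five characters per word."""
    return cpm / chars_per_word

print(chars_per_min_to_wpm(39))  # 7.8, i.e. about eight words per minute
```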
A tiny silicon chip
The investigational system used in the study, an intracortical brain-computer interface called the BrainGate Neural Interface System, represents the newest generation of BCIs. Previous generations picked up signals first via electrical leads placed on the scalp, then via electrodes surgically positioned at the brain’s surface beneath the skull.
An intracortical BCI uses a tiny silicon chip, just over one-sixth of an inch square, from which protrude 100 electrodes that penetrate the brain to about the thickness of a quarter and tap into the electrical activity of individual nerve cells in the motor cortex.
Henderson likened the resulting improved resolution of neural sensing, compared with that of older-generation BCIs, to that of handing out applause meters to individual members of a studio audience rather than just stationing them on the ceiling, “so you can tell just how hard and how fast each person in the audience is clapping.”
Shenoy said the day will come — closer to five than 10 years from now, he predicted — when a self-calibrating, fully implanted wireless system can be used without caregiver assistance, has no cosmetic impact and can be used around the clock.
“I don’t see any insurmountable challenges,” he said. “We know the steps we have to take to get there.”
Degray, who continues to participate actively in the research, knew how to type before his accident but was no expert at it. He described his newly revealed prowess in the language of a video game aficionado.
“This is like one of the coolest video games I’ve ever gotten to play with,” he said. “And I don’t even have to put a quarter in it.”
The study’s results are the culmination of a long-running collaboration between Henderson and Shenoy and a multi-institutional consortium called BrainGate. Leigh Hochberg, MD, PhD, a neurologist and neuroscientist at Massachusetts General Hospital, Brown University and the VA Rehabilitation Research and Development Center for Neurorestoration and Neurotechnology in Providence, Rhode Island, directs the pilot clinical trial of the BrainGate system and is a study co-author.
“This incredible collaboration continues to break new ground in developing powerful, intuitive, flexible neural interfaces that we all hope will one day restore communication, mobility and independence for people with neurologic disease or injury,” said Hochberg.
A computer interface that can decipher the thoughts of people who are unable to communicate could revolutionize the lives of those living with completely locked-in syndrome, according to a new paper publishing January 31st, 2017 in PLOS Biology.
Counter to expectations, the participants in the study reported being “happy”, despite their extreme condition. The research was conducted by a multinational team, led by Professor Niels Birbaumer, at the Wyss Center for Bio and Neuroengineering in Geneva, Switzerland.
Patients suffering from complete paralysis, but with preserved awareness, cognition, and eye movements and blinking are classified as having locked-in syndrome. If eye movements are also lost, the condition is referred to as completely locked-in syndrome.
In the trial, patients with completely locked-in syndrome were able to respond “yes” or “no” to spoken questions, by thinking the answers. A non-invasive brain-computer interface detected their responses by measuring changes in blood oxygen levels in the brain.
The results overturn previous theories that postulate that people with completely locked-in syndrome lack the goal-directed thinking necessary to use a brain-computer interface and are, therefore, incapable of communication.
Extensive investigations were carried out in four patients with ALS (amyotrophic lateral sclerosis, also known as Lou Gehrig’s disease) — a progressive motor neuron disease that leads to complete destruction of the part of the nervous system responsible for movement.
The researchers asked personal questions with known answers and open questions that needed “yes” or “no” answers, including: “Your husband’s name is Joachim?” and “Are you happy?” They found the questions elicited correct responses in 70 percent of the trials.
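The published system trained a classifier per patient on the fNIRS signal; as a purely illustrative stand-in, a simple threshold on the mean blood-oxygenation change during the answer window captures the idea (the data and effect sizes below are simulated, not from the study):

```python
import numpy as np

rng = np.random.default_rng(1)

def classify_trial(oxy_trace, threshold=0.0):
    """Label a trial 'yes' or 'no' from its mean oxygenation change."""
    return "yes" if np.mean(oxy_trace) > threshold else "no"

# Simulated trials: "yes" answers show a slightly elevated response.
yes_trials = rng.normal(0.3, 1.0, size=(100, 20))
no_trials = rng.normal(-0.3, 1.0, size=(100, 20))

correct = sum(classify_trial(t) == "yes" for t in yes_trials)
correct += sum(classify_trial(t) == "no" for t in no_trials)
accuracy = correct / 200
```

Because the hemodynamic response overlaps heavily between conditions, even a well-trained classifier gets only a fraction of trials right, which is why above-chance accuracy across many repetitions is the meaningful benchmark.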
Professor Birbaumer said: “The striking results overturn my own theory that people with completely locked-in syndrome are not capable of communication. We found that all four patients we tested were able to answer the personal questions we asked them, using their thoughts alone. If we can replicate this study in more patients, I believe we could restore useful communication in completely locked-in states for people with motor neuron diseases.”
The question “Are you happy?” resulted in a consistent “yes” response from the four people, repeated over weeks of questioning.
Professor Birbaumer added: “We were initially surprised at the positive responses when we questioned the four completely locked-in patients about their quality of life. All four had accepted artificial ventilation in order to sustain their lives when breathing became impossible; thus, in a sense, they had already chosen to live. What we observed was that as long as they received satisfactory care at home, they found their quality of life acceptable. For this reason, if we could make this technique widely available clinically, it could have a huge impact on the day-to-day life of people with completely locked-in syndrome.”
In one case, a family requested that the researchers ask one of the participants whether he would agree to his daughter marrying her boyfriend ‘Mario’. The answer was “no”, nine times out of ten.
Professor John Donoghue, Director of the Wyss Center, said: “Restoring communication for completely locked-in patients is a crucial first step in the challenge to regain movement. The Wyss Center plans to build on the results of this study to develop clinically useful technology that will be available to people with paralysis resulting from ALS, stroke, or spinal cord injury. The technology used in the study also has broader applications that we believe could be further developed to treat and monitor people with a wide range of neuro-disorders.”
The brain-computer interface in the study used near-infrared spectroscopy combined with electroencephalography (EEG) to measure blood oxygenation and electrical activity in the brain. While other brain-computer interfaces have previously enabled some paralyzed patients to communicate, near-infrared spectroscopy is, so far, the only successful approach to restore communication to patients suffering from completely locked-in syndrome.
At UMC Utrecht, a brain implant has been placed in a patient, enabling her to operate a speech computer with her mind.
The researchers and the patient worked intensively to get the settings right. She can now communicate at home with her family and caregivers via the implant. That a patient can use this technique at home is unique in the world. This research was published in the New England Journal of Medicine.
Because she suffers from ALS, the patient is no longer able to move or speak. Doctors placed electrodes in her brain that pick up brain activity, enabling her to wirelessly control a speech computer that she now uses at home.
“This is a major breakthrough in achieving autonomous communication among severely paralyzed patients whose paralysis is caused by either ALS, a cerebral hemorrhage or trauma,” says Professor Nick Ramsey, professor of cognitive neuroscience at the University Medical Center (UMC) Utrecht. “In effect, this patient has had a kind of remote control placed in her head, which enables her to operate a speech computer without the use of her muscles.”
The patient operates the speech computer by moving her fingers in her mind. This changes the brain signal under the electrodes. That change is converted into a mouse click. On a screen in front of her she can see the alphabet, plus some additional functions such as deleting a letter or word and selecting words based on the letters she has already spelled. The letters on the screen light up one by one. She selects a letter by influencing the mouse click at the right moment with her brain. That way she can compose words, letter by letter, which are then spoken by the speech computer. This technique is comparable to actuating a speech computer via a push-button (with a muscle that can still function, for example, in the neck or hand). So now, if a patient lacks muscle activity, a brain signal can be used instead.
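The scanning-and-click scheme described above can be sketched in a few lines. This is an illustrative simulation only: the `click` callback stands in for the actual brain-signal click detector, and a perfectly timed user is assumed.

```python
# Letters light up one at a time; a decoded "brain click" at the right
# moment selects the currently highlighted letter.
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def spell(word, click):
    """Simulate spelling `word`; `click(letter, typed_so_far)` returns
    True when the user issues a brain click on the highlighted letter."""
    typed = []
    while "".join(typed) != word:
        for letter in ALPHABET:          # letters light up one by one
            if click(letter, "".join(typed)):
                typed.append(letter)
                break
    return "".join(typed)

# A perfectly timed user clicks exactly when the intended letter is lit.
target = "dag"
result = spell(target, lambda letter, typed: letter == target[len(typed)])
print(result)  # dag
```

As the text notes, this is the same selection logic as a push-button scanning speller; only the source of the click differs.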
The patient underwent surgery during which electrodes were placed on her brain through tiny holes in her skull. A small transmitter was then placed in her body below her collarbone. This transmitter receives the signals from the electrodes via subcutaneous wires, amplifies them and transmits them wirelessly. The mouse click is calculated from these signals, actuating the speech computer. The patient is closely supervised. Shortly after the operation, she started on a journey of discovery together with the researchers to find the right settings for the device and the perfect way to get her brain activity under control. It started with a “simple” game to practice the art of clicking. Once she mastered clicking, she focused on the speech computer. She can now use the speech computer without the help of the research team.
The UMC Utrecht Brain Center has spent many years researching the possibility of controlling a computer by means of electrodes that capture brain activity. Speech computers driven by brain signals measured with an electrode cap worn on the head have long been tested in various research laboratories. That a patient can use the technique at home, through invisible, implanted electrodes, is unique in the world.
If the implant proves to work well in three people, the researchers hope to launch a larger, international trial. Ramsey: “We hope that these results will stimulate research into more advanced implants, so that some day not only people with communication problems, but also people with paraplegia, for example, can be helped.”
Recent research demonstrates a brain-to-text device capable of decoding speech from brain signals
Ever wonder what it would be like if a device could decode your thoughts into actual speech or written words? While this might enhance the capabilities of already existing speech interfaces with devices, it could be a potential game-changer for those with speech pathologies, and even more so for “locked-in” patients who lack any speech or motor function.
“So instead of saying ‘Siri, what is the weather like today’ or ‘Ok Google, where can I go for lunch?’ I just imagine saying these things,” explains Christian Herff, author of a review recently published in the journal Frontiers in Human Neuroscience.
While reading one’s thoughts might still belong to the realms of science fiction, scientists are already decoding speech from signals generated in our brains when we speak or listen to speech.
In their review, Herff and co-author, Dr. Tanja Schultz, compare the pros and cons of using various brain imaging techniques to capture neural signals from the brain and then decode them to text.
The technologies range from functional MRI and near-infrared imaging, which detect neural signals based on the metabolic activity of neurons, to methods such as EEG and magnetoencephalography (MEG), which detect the electromagnetic activity of neurons responding to speech. One method in particular, called electrocorticography or ECoG, showed promise in Herff’s study.
That study presented the Brain-to-text system, in which epilepsy patients who already had electrode grids implanted for treatment of their condition read out texts presented on a screen while their brain activity was recorded. This formed the basis of a database of neural signal patterns that could then be matched to speech elements, or “phones”.
When the researchers also included language and dictionary models in their algorithms, they were able to decode neural signals to text with a high degree of accuracy. “For the first time, we could show that brain activity can be decoded specifically enough to use ASR technology on brain signals,” says Herff. “However, the current need for implanted electrodes renders it far from usable in day-to-day life.”
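The combination of decoded phone scores with dictionary and language models can be illustrated with a toy example. Everything below is hypothetical: the phone probabilities are simulated numbers, the dictionary has two words, and the alignment is simplified to one frame per phone, which the real system handles with full ASR machinery.

```python
import math

DICTIONARY = {
    "cat": ["k", "ae", "t"],
    "bat": ["b", "ae", "t"],
}

def word_log_likelihood(phones, frame_scores):
    """Sum log-probabilities of a phone sequence under decoded scores."""
    return sum(math.log(scores[p]) for p, scores in zip(phones, frame_scores))

frames = [  # simulated decoder output: P(phone) per time frame
    {"k": 0.70, "b": 0.25, "ae": 0.03, "t": 0.02},
    {"k": 0.05, "b": 0.05, "ae": 0.85, "t": 0.05},
    {"k": 0.02, "b": 0.02, "ae": 0.06, "t": 0.90},
]

best = max(DICTIONARY, key=lambda w: word_log_likelihood(DICTIONARY[w], frames))
print(best)  # cat
```

The dictionary constrains ambiguous frames (here the "k"/"b" distinction) toward valid words, which is why adding language and dictionary models raised accuracy.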
So, where does the field go from here to a functioning thought detection device? “A first milestone would be to actually decode imagined phrases from brain activity, but a lot of technical issues need to be solved for that,” concedes Herff.
Imagine being in an accident that leaves you unable to feel any sensation in your arms and fingers. Now imagine regaining that sensation, a decade later, through a mind-controlled robotic arm that is directly connected to your brain.
That is what 28-year-old Nathan Copeland experienced after he came out of brain surgery and was connected to the Brain Computer Interface (BCI), developed by researchers at the University of Pittsburgh and UPMC. In a study published online today in Science Translational Medicine, a team of experts led by Robert Gaunt, Ph.D., assistant professor of physical medicine and rehabilitation at Pitt, demonstrated for the first time ever in humans a technology that allows Mr. Copeland to experience the sensation of touch through a robotic arm that he controls with his brain.
“The most important result in this study is that microstimulation of sensory cortex can elicit natural sensation instead of tingling,” said study co-author Andrew B. Schwartz, Ph.D., distinguished professor of neurobiology and chair in systems neuroscience, Pitt School of Medicine, and a member of the University of Pittsburgh Brain Institute. “This stimulation is safe, and the evoked sensations are stable over months. There is still a lot of research that needs to be carried out to better understand the stimulation patterns needed to help patients make better movements.”
This is not the Pitt-UPMC team’s first attempt at a BCI. Four years ago, study co-author Jennifer Collinger, Ph.D., assistant professor, Pitt’s Department of Physical Medicine and Rehabilitation, and research scientist for the VA Pittsburgh Healthcare System, and the team demonstrated a BCI that helped Jan Scheuermann, who has quadriplegia caused by a degenerative disease. The video of Scheuermann feeding herself chocolate using the mind-controlled robotic arm was seen around the world. Before that, Tim Hemmes, paralyzed in a motorcycle accident, reached out to touch hands with his girlfriend.
But the way our arms naturally move and interact with the environment around us is due to more than just thinking and moving the right muscles. We are able to differentiate between a piece of cake and a soda can through touch, picking up the cake more gently than the can. The constant feedback we receive from the sense of touch is of paramount importance as it tells the brain where to move and by how much.
For Dr. Gaunt and the rest of the research team, that was the next step for the BCI. As they were looking for the right candidate, they developed and refined their system such that inputs from the robotic arm are transmitted through a microelectrode array implanted in the brain where the neurons that control hand movement and touch are located. The microelectrode array and its control system, which were developed by Blackrock Microsystems, along with the robotic arm, which was built by Johns Hopkins University’s Applied Physics Lab, formed all the pieces of the puzzle.
In the winter of 2004, Mr. Copeland, who lives in western Pennsylvania, was driving at night in rainy weather when he was in a car accident that snapped his neck and injured his spinal cord, leaving him with quadriplegia from the upper chest down, unable to feel or move his lower arms and legs, and needing assistance with all his daily activities. He was 18 and in his freshman year of college pursuing a degree in nanofabrication, following a high school spent in advanced science courses.
He tried to continue his studies, but health problems forced him to put his degree on hold. He kept busy by going to concerts and volunteering for the Pittsburgh Japanese Culture Society, a nonprofit that holds conventions around the Japanese cartoon art of anime, something Mr. Copeland became interested in after his accident.
Right after the accident he had enrolled himself on Pitt’s registry of patients willing to participate in clinical trials. Nearly a decade later, the Pitt research team asked if he was interested in participating in the experimental study.
After he passed the screening tests, Nathan was wheeled into the operating room last spring. Study co-investigator and UPMC neurosurgeon Elizabeth Tyler-Kabara, M.D., Ph.D., assistant professor, Department of Neurological Surgery, Pitt School of Medicine, implanted four tiny microelectrode arrays, each about half the size of a shirt button, in Nathan’s brain. Prior to the surgery, imaging techniques were used to identify the exact regions of Mr. Copeland’s brain corresponding to feelings in each of his fingers and his palm.
“I can feel just about every finger—it’s a really weird sensation,” Mr. Copeland said about a month after surgery. “Sometimes it feels electrical and sometimes it’s pressure, but for the most part, I can tell most of the fingers with definite precision. It feels like my fingers are getting touched or pushed.”
At this time, Mr. Copeland can feel pressure and distinguish its intensity to some extent, though he cannot identify whether a substance is hot or cold, explains Dr. Tyler-Kabara.
Michael Boninger, M.D., professor of physical medicine and rehabilitation at Pitt, and senior medical director of post-acute care for the Health Services Division of UPMC, recounted how the Pitt team has achieved milestone after milestone, from a basic understanding of how the brain processes sensory and motor signals to applying it in patients.
“Slowly but surely, we have been moving this research forward. Four years ago we demonstrated control of movement. Now Dr. Gaunt and his team took what we learned in our tests with Tim and Jan—for whom we have deep gratitude—and showed us how to make the robotic arm allow its user to feel through Nathan’s dedicated work,” said Dr. Boninger, also a co-author on the research paper.
Dr. Gaunt explained that everything about the work is meant to make use of the brain’s natural, existing abilities to give people back what was lost but not forgotten.
“The ultimate goal is to create a system which moves and feels just like a natural arm would,” says Dr. Gaunt. “We have a long way to go to get there, but this is a great start.”
ASU researcher creates system to control robots with the brain
A researcher at Arizona State University has discovered how to control multiple robotic drones using the human brain.
A controller wears a skull cap outfitted with 128 electrodes wired to a computer. The device records electrical brain activity. If the controller moves a hand or thinks of something, certain areas light up.
“I can see that activity from outside,” said Panagiotis Artemiadis (pictured above), director of the Human-Oriented Robotics and Control Lab and an assistant professor of mechanical and aerospace engineering in the School for Engineering of Matter, Transport and Energy in the Ira A. Fulton Schools of Engineering. “Our goal is to decode that activity to control variables for the robots.”
If the user is thinking about decreasing cohesion between the drones — spreading them out, in other words — “we know what part of the brain controls that thought,” Artemiadis said.
A wireless system sends the thought to the robots. “We have a motion-capture system that knows where the quads are, and we change their distance, and that’s it,” he said.
Up to four small robots, some of which fly, can be controlled with brain interfaces. Joysticks don’t work, because they can only control one craft at a time.
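The cohesion command described above amounts to scaling the drones' positions about the swarm centroid; the motion-capture system then flies the quads to the new targets. A minimal sketch, with a hypothetical step size and made-up coordinates:

```python
import numpy as np

def apply_cohesion(positions, command, step=0.2):
    """positions: (n, 2) array of drone coordinates.
    'spread' scales positions away from the centroid; 'gather' toward it."""
    centroid = positions.mean(axis=0)
    scale = 1 + step if command == "spread" else 1 - step
    return centroid + (positions - centroid) * scale

quads = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 2.0]])
spread = apply_cohesion(quads, "spread")   # drones move apart
gather = apply_cohesion(quads, "gather")   # drones move together
```

A single decoded variable (cohesion) steers the whole formation at once, which is exactly what a one-craft-at-a-time joystick cannot do.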
Scientists at Korea University in Korea and TU Berlin in Germany have developed a brain-computer control interface for a lower limb exoskeleton by decoding specific signals from within the user’s brain.
Using an electroencephalogram (EEG) cap, the system allows users to move forwards, turn left and right, sit and stand simply by staring at one of five flickering light emitting diodes (LEDs).
The results are published today (Tuesday 18th August) in the Journal of Neural Engineering.
Each of the five LEDs flickers at a different frequency, and when the user focuses their attention on a specific LED this frequency is reflected within the EEG readout. This signal is identified and used to control the exoskeleton.
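This steady-state visual evoked potential (SSVEP) scheme can be sketched as picking the candidate LED frequency with the most EEG spectral power. The frequencies, command mapping, sampling rate and simulated signal below are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

FS = 256  # sampling rate in Hz (illustrative)
COMMANDS = {9.0: "forward", 11.0: "left", 13.0: "right",
            15.0: "sit", 17.0: "stand"}

def detect_command(eeg, fs=FS):
    """Return the command whose LED frequency dominates the spectrum."""
    freqs = np.fft.rfftfreq(len(eeg), 1 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2
    best = max(COMMANDS, key=lambda f: power[np.argmin(np.abs(freqs - f))])
    return COMMANDS[best]

# Simulate 2 s of EEG dominated by the 11 Hz ("left") LED plus noise.
t = np.arange(0, 2, 1 / FS)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 11.0 * t) + 0.5 * rng.standard_normal(len(t))
print(detect_command(eeg))  # left
```

The hard part in practice, as the next paragraphs explain, is that exoskeleton noise swamps this clean picture and must be separated out first.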
A key problem has been separating these precise brain signals from those associated with other brain activity, and the highly artificial signals generated by the exoskeleton.
“Exoskeletons create lots of electrical ‘noise,’” explains Klaus Muller, an author on the paper. “The EEG signal gets buried under all this noise, but our system is able to separate not only the EEG signal, but also the frequency of the flickering LED within this signal.”
Although the paper reports tests on healthy individuals, the system has the potential to aid sick or disabled people.
“People with amyotrophic lateral sclerosis (ALS) [motor neuron disease] or high spinal cord injuries face difficulties communicating or using their limbs,” continues Muller. “Decoding what they intend from their brain signals could offer a means to communicate and walk again.”
The control system could serve as a technically simple and feasible add-on to other devices, with EEG caps and hardware now emerging on the consumer market.