At UMC Utrecht, a brain implant has been placed in a patient, enabling her to operate a speech computer with her mind.
The researchers and the patient worked intensively to get the settings right. She can now communicate at home with her family and caregivers via the implant; she is the first patient in the world to use this technique at home. The research was published in the New England Journal of Medicine.
Because she suffers from ALS (amyotrophic lateral sclerosis), the patient is no longer able to move or speak. Doctors placed electrodes in her brain that pick up brain activity, enabling her to wirelessly control a speech computer that she now uses at home.
“This is a major breakthrough in achieving autonomous communication among severely paralyzed patients whose paralysis is caused by either ALS, a cerebral hemorrhage or trauma,” says Professor Nick Ramsey, professor of cognitive neuroscience at the University Medical Center (UMC) Utrecht. “In effect, this patient has had a kind of remote control placed in her head, which enables her to operate a speech computer without the use of her muscles.”
The patient operates the speech computer by imagining moving her fingers. This changes the brain signal under the electrodes, and that change is converted into a mouse click. On a screen in front of her she sees the alphabet, plus additional functions such as deleting a letter or word and selecting words based on the letters she has already spelled. The letters on the screen light up one by one, and she selects a letter by triggering the mouse click at the right moment. In this way she composes words letter by letter, which the speech computer then speaks aloud. The technique is comparable to operating a speech computer with a push-button pressed by a muscle that still functions, for example in the neck or hand; where a patient lacks any muscle activity, a brain signal is used instead.
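The selection logic is essentially a one-switch scanning speller driven by a brain-generated click. Here is a minimal sketch in Python; the `brain_click()` stub stands in for the real decoder output and simply fires at random, and the dwell time and end-of-word marker are assumptions for illustration:

```python
import random
import time

ALPHABET = list("ABCDEFGHIJKLMNOPQRSTUVWXYZ_")  # '_' ends the word (assumed)
DWELL = 0.5  # seconds each letter stays highlighted (assumed)

def brain_click() -> bool:
    """Stand-in for the real decoder: in the actual system this returns
    True when the imagined hand movement changes the signal under the
    electrodes enough to count as a click. Here it just fires at random."""
    return random.random() < 0.02

def scan_spell() -> str:
    """One-switch scanning speller: letters light up one by one, and a
    well-timed brain click selects the currently highlighted letter."""
    word = ""
    while True:
        for letter in ALPHABET:
            print(f"highlight: {letter}")
            deadline = time.monotonic() + DWELL
            clicked = False
            while time.monotonic() < deadline:
                if brain_click():
                    clicked = True
                    break
                time.sleep(0.01)  # poll the decoder every 10 ms
            if clicked:
                if letter == "_":      # end-of-word marker selected
                    return word
                word += letter
                print(f"spelled so far: {word}")
                break  # restart the scan for the next letter

print(scan_spell())
```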
The patient underwent surgery during which electrodes were placed on her brain through tiny holes in her skull. A small transmitter was then placed in her body below her collarbone. This transmitter receives the signals from the electrodes via subcutaneous wires, amplifies them and transmits them wirelessly. The mouse click is calculated from these signals, actuating the speech computer. The patient is closely supervised. Shortly after the operation, she started on a journey of discovery together with the researchers to find the right settings for the device and the perfect way to get her brain activity under control. It started with a “simple” game to practice the art of clicking. Once she mastered clicking, she focused on the speech computer. She can now use the speech computer without the help of the research team.
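The article does not spell out how the click is computed from the amplified signals. A minimal sketch of one plausible approach, thresholding band power in a short window, with the sampling rate, frequency band, and threshold all assumptions:

```python
import numpy as np

FS = 200          # sampling rate in Hz (assumed; the article gives none)
BAND = (65, 95)   # high-gamma band often used over motor cortex (assumed)

def band_power(window: np.ndarray) -> float:
    """Signal power in BAND for one electrode's window of samples."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(window.size, d=1.0 / FS)
    return float(spectrum[(freqs >= BAND[0]) & (freqs <= BAND[1])].sum())

def to_click(window: np.ndarray, threshold: float) -> bool:
    """Report a 'mouse click' when attempted movement pushes the band
    power past a threshold calibrated per patient during training."""
    return band_power(window) > threshold
```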
The UMC Utrecht Brain Center has spent many years researching the possibility of controlling a computer by means of electrodes that capture brain activity. Speech computers driven by brain signals measured with an electrode cap (worn like a swimming cap) have long been tested in various research laboratories. That a patient can use the technique at home, through invisible, implanted electrodes, is unique in the world.
If the implant proves to work well in three people, the researchers hope to launch a larger, international trial. Ramsey: “We hope that these results will stimulate research into more advanced implants, so that some day not only people with communication problems, but also people with paraplegia, for example, can be helped.”
Recent research shows a Brain-to-text device capable of decoding speech from brain signals
Ever wonder what it would be like if a device could decode your thoughts into actual speech or written words? While this might enhance the capabilities of already existing speech interfaces with devices, it could be a potential game-changer for those with speech pathologies, and even more so for “locked-in” patients who lack any speech or motor function.
“So instead of saying ‘Siri, what is the weather like today’ or ‘Ok Google, where can I go for lunch?’ I just imagine saying these things,” explains Christian Herff, author of a review recently published in the journal Frontiers in Human Neuroscience.
While reading one’s thoughts might still belong to the realms of science fiction, scientists are already decoding speech from signals generated in our brains when we speak or listen to speech.
In their review, Herff and his co-author, Dr. Tanja Schultz, compare the pros and cons of various brain imaging techniques used to capture neural signals from the brain and decode them to text.
The technologies range from functional MRI and near-infrared imaging, which detect neural signals based on the metabolic activity of neurons, to methods such as EEG and magnetoencephalography (MEG), which detect the electromagnetic activity of neurons responding to speech. One method in particular, electrocorticography (ECoG), showed promise in Herff’s study.
That study presented the Brain-to-text system, in which epilepsy patients who already had electrode grids implanted for treatment of their condition participated. They read texts presented on a screen in front of them while their brain activity was recorded, building a database of neural signal patterns that could then be matched to speech elements, or “phones”.
When the researchers also included language and dictionary models in their algorithms, they were able to decode neural signals to text with a high degree of accuracy. “For the first time, we could show that brain activity can be decoded specifically enough to use automatic speech recognition (ASR) technology on brain signals,” says Herff. “However, the current need for implanted electrodes renders it far from usable in day-to-day life.”
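As a rough illustration of that decoding idea, the sketch below scores candidate dictionary words by combining per-segment phone likelihoods with a language-model prior, the same combination an ASR decoder makes. Every model, phone set, and number here is invented; the real Brain-to-text system uses full HMM-based ASR over ECoG features:

```python
import numpy as np

# Toy phone models: a mean neural-feature vector per phone, as if learned
# from the read-aloud recordings (all values invented for illustration).
PHONE_MODELS = {
    "HH": np.array([0.9, 0.1]),
    "Y":  np.array([0.7, 0.4]),
    "EH": np.array([0.2, 0.8]),
    "L":  np.array([0.5, 0.5]),
    "OW": np.array([0.1, 0.9]),
}

# Dictionary plus language model: candidate words with prior log-probs.
LEXICON = {
    "HELLO":  (["HH", "EH", "L", "OW"], np.log(0.6)),
    "YELLOW": (["Y", "EH", "L", "OW"], np.log(0.4)),
}

def phone_loglik(feature: np.ndarray, phone: str) -> float:
    """Gaussian-like score: features near the phone's mean score higher."""
    return -float(np.sum((feature - PHONE_MODELS[phone]) ** 2))

def decode_word(features: list) -> str:
    """Score each word as (sum of per-segment phone log-likelihoods)
    + (language-model log-prior) and keep the best, mirroring how an
    ASR decoder combines acoustic and language models."""
    best_word, best_score = None, -np.inf
    for word, (phones, prior) in LEXICON.items():
        if len(phones) != len(features):
            continue  # a real decoder aligns segments; we assume one
                      # feature vector per phone segment
        score = prior + sum(phone_loglik(f, p)
                            for f, p in zip(features, phones))
        if score > best_score:
            best_word, best_score = word, score
    return best_word

# One invented feature vector per spoken phone segment:
segments = [np.array([0.85, 0.15]), np.array([0.25, 0.75]),
            np.array([0.50, 0.50]), np.array([0.15, 0.85])]
print(decode_word(segments))  # -> HELLO
```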
So, where does the field go from here to a functioning thought detection device? “A first milestone would be to actually decode imagined phrases from brain activity, but a lot of technical issues need to be solved for that,” concedes Herff.
Imagine being in an accident that leaves you unable to feel any sensation in your arms and fingers. Now imagine regaining that sensation, a decade later, through a mind-controlled robotic arm that is directly connected to your brain.
That is what 28-year-old Nathan Copeland experienced after he came out of brain surgery and was connected to the Brain Computer Interface (BCI), developed by researchers at the University of Pittsburgh and UPMC. In a study published online today in Science Translational Medicine, a team of experts led by Robert Gaunt, Ph.D., assistant professor of physical medicine and rehabilitation at Pitt, demonstrated for the first time ever in humans a technology that allows Mr. Copeland to experience the sensation of touch through a robotic arm that he controls with his brain.
“The most important result in this study is that microstimulation of sensory cortex can elicit natural sensation instead of tingling,” said study co-author Andrew B. Schwartz, Ph.D., distinguished professor of neurobiology and chair in systems neuroscience, Pitt School of Medicine, and a member of the University of Pittsburgh Brain Institute. “This stimulation is safe, and the evoked sensations are stable over months. There is still a lot of research that needs to be carried out to better understand the stimulation patterns needed to help patients make better movements.”
This is not the Pitt-UPMC team’s first attempt at a BCI. Four years ago, study co-author Jennifer Collinger, Ph.D., assistant professor, Pitt’s Department of Physical Medicine and Rehabilitation, and research scientist for the VA Pittsburgh Healthcare System, and the team demonstrated a BCI that helped Jan Scheuermann, who has quadriplegia caused by a degenerative disease. The video of Scheuermann feeding herself chocolate using the mind-controlled robotic arm was seen around the world. Before that, Tim Hemmes, paralyzed in a motorcycle accident, reached out to touch hands with his girlfriend.
But the way our arms naturally move and interact with the environment around us is due to more than just thinking and moving the right muscles. We are able to differentiate between a piece of cake and a soda can through touch, picking up the cake more gently than the can. The constant feedback we receive from the sense of touch is of paramount importance as it tells the brain where to move and by how much.
For Dr. Gaunt and the rest of the research team, that was the next step for the BCI. As they were looking for the right candidate, they developed and refined their system such that inputs from the robotic arm are transmitted through a microelectrode array implanted in the brain where the neurons that control hand movement and touch are located. The microelectrode array and its control system, which were developed by Blackrock Microsystems, along with the robotic arm, which was built by Johns Hopkins University’s Applied Physics Lab, formed all the pieces of the puzzle.
In the winter of 2004, Mr. Copeland, who lives in western Pennsylvania, was driving at night in rainy weather when he was in a car accident that snapped his neck and injured his spinal cord, leaving him with quadriplegia from the upper chest down, unable to feel or move his lower arms and legs, and needing assistance with all his daily activities. He was 18 and in his freshman year of college pursuing a degree in nanofabrication, following a high school spent in advanced science courses.
He tried to continue his studies, but health problems forced him to put his degree on hold. He kept busy by going to concerts and volunteering for the Pittsburgh Japanese Culture Society, a nonprofit that holds conventions around the Japanese cartoon art of anime, something Mr. Copeland became interested in after his accident.
Right after the accident, he had enrolled in Pitt’s registry of patients willing to participate in clinical trials. Nearly a decade later, the Pitt research team asked whether he was interested in participating in the experimental study.
After he passed the screening tests, Nathan was wheeled into the operating room last spring. Study co-investigator and UPMC neurosurgeon Elizabeth Tyler-Kabara, M.D., Ph.D., assistant professor in the Department of Neurological Surgery, Pitt School of Medicine, implanted four tiny microelectrode arrays, each about half the size of a shirt button, in Nathan’s brain. Prior to the surgery, imaging techniques were used to identify the exact regions of Mr. Copeland’s brain corresponding to feelings in each of his fingers and his palm.
“I can feel just about every finger—it’s a really weird sensation,” Mr. Copeland said about a month after surgery. “Sometimes it feels electrical and sometimes it’s pressure, but for the most part, I can tell most of the fingers with definite precision. It feels like my fingers are getting touched or pushed.”
At this time, Mr. Copeland can feel pressure and distinguish its intensity to some extent, though he cannot identify whether a substance is hot or cold, explains Dr. Tyler-Kabara.
Michael Boninger, M.D., professor of physical medicine and rehabilitation at Pitt, and senior medical director of post-acute care for the Health Services Division of UPMC, recounted how the Pitt team has achieved milestone after milestone, from a basic understanding of how the brain processes sensory and motor signals to applying it in patients.
“Slowly but surely, we have been moving this research forward. Four years ago we demonstrated control of movement. Now Dr. Gaunt and his team took what we learned in our tests with Tim and Jan—for whom we have deep gratitude—and showed us how to make the robotic arm allow its user to feel through Nathan’s dedicated work,” said Dr. Boninger, also a co-author on the research paper.
Dr. Gaunt explained that everything about the work is meant to make use of the brain’s natural, existing abilities to give people back what was lost but not forgotten.
“The ultimate goal is to create a system which moves and feels just like a natural arm would,” says Dr. Gaunt. “We have a long way to go to get there, but this is a great start.”
ASU researcher creates system to control robots with the brain
A researcher at Arizona State University has discovered how to control multiple robotic drones using the human brain.
A controller wears a skull cap outfitted with 128 electrodes wired to a computer. The device records electrical brain activity. If the controller moves a hand or thinks of something, certain areas light up.
“I can see that activity from outside,” said Panagiotis Artemiadis (pictured above), director of the Human-Oriented Robotics and Control Lab and an assistant professor of mechanical and aerospace engineering in the School for Engineering of Matter, Transport and Energy in the Ira A. Fulton Schools of Engineering. “Our goal is to decode that activity to control variables for the robots.”
If the user is thinking about decreasing cohesion between the drones — spreading them out, in other words — “we know what part of the brain controls that thought,” Artemiadis said.
A wireless system sends the thought to the robots. “We have a motion-capture system that knows where the quads are, and we change their distance, and that’s it,” he said.
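The article gives only that outline: a decoded “cohesion” variable turned into new inter-drone distances via motion capture. A minimal sketch of the spacing update, with the scale-about-the-centroid rule as an assumption:

```python
import numpy as np

def adjust_cohesion(positions: np.ndarray, factor: float) -> np.ndarray:
    """Scale each drone's offset from the swarm centroid: factor > 1
    spreads the swarm out (less cohesion), factor < 1 pulls it in."""
    centroid = positions.mean(axis=0)
    return centroid + factor * (positions - centroid)

# e.g. the decoder maps the "spread out" brain state to factor = 1.5;
# the motion-capture system supplies current (x, y, z) positions:
quads = np.array([[0.0, 0.0, 1.0],
                  [1.0, 0.0, 1.0],
                  [0.5, 1.0, 1.0]])
print(adjust_cohesion(quads, 1.5))
```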
Up to four small robots, some of which fly, can be controlled with brain interfaces. Joysticks don’t work, because they can only control one craft at a time.
Scientists working at Korea University, Korea, and TU Berlin, Germany, have developed a brain-computer interface that controls a lower limb exoskeleton by decoding specific signals from the user’s brain.
Using an electroencephalogram (EEG) cap, the system allows users to move forwards, turn left and right, sit and stand simply by staring at one of five flickering light emitting diodes (LEDs).
The results are published today (Tuesday 18th August) in the Journal of Neural Engineering.
Each of the five LEDs flickers at a different frequency, and when the user focuses their attention on a specific LED, that frequency is reflected in the EEG readout. This signal is identified and used to control the exoskeleton.
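This is the steady-state visually evoked potential (SSVEP) paradigm: the attended flicker frequency dominates the EEG spectrum over visual cortex. A minimal sketch of the frequency-matching step, with the LED frequencies, command mapping, and sampling rate all assumed:

```python
import numpy as np

FS = 256                                    # EEG sampling rate (assumed)
LED_FREQS = [9.0, 11.0, 13.0, 15.0, 17.0]   # flicker rates in Hz (assumed)
COMMANDS = ["forward", "left", "right", "sit", "stand"]

def classify_ssvep(eeg: np.ndarray) -> str:
    """Return the command whose LED flicker frequency carries the most
    spectral power in the EEG window: the simplest SSVEP classifier
    (the published system uses far more noise-robust processing)."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(eeg.size, d=1.0 / FS)
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in LED_FREQS]
    return COMMANDS[int(np.argmax(powers))]

# Demo: synthesize a 2-second window dominated by the 13 Hz flicker.
t = np.arange(2 * FS) / FS
window = np.sin(2 * np.pi * 13.0 * t) + 0.5 * np.random.randn(t.size)
print(classify_ssvep(window))  # usually prints "right"
```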
A key problem has been separating these precise brain signals from those associated with other brain activity, and the highly artificial signals generated by the exoskeleton.
“Exoskeletons create lots of electrical ‘noise’,” explains Klaus Muller, an author on the paper. “The EEG signal gets buried under all this noise, but our system is able to separate not only the EEG signal, but the frequency of the flickering LED within this signal.”
Although the paper reports tests on healthy individuals, the system has the potential to aid sick or disabled people.
“People with amyotrophic lateral sclerosis (ALS) [motor neuron disease] or high spinal cord injuries face difficulties communicating or using their limbs,” continues Muller. “Decoding what they intend from their brain signals could offer a means to communicate and walk again.”
The control system could serve as a technically simple and feasible add-on to other devices, with EEG caps and hardware now emerging on the consumer market.