Letter by letter, a completely paralyzed man was able to express his feelings for the first time, to his family's delight, when he spelled out the phrase: “I love my cool son.”
The patient is in the final stage of the neurological disease amyotrophic lateral sclerosis (ALS), which leads to extreme isolation and, eventually, the complete inability to communicate. People with this condition lose control of their muscles, and communication can become impossible. But with the help of an implanted device that reads signals from his brain, this man was able to select letters and form sentences, according to the researchers who conducted the experiment.
“People have really doubted whether this was even feasible,” said Mariska Vansteensel, a brain-computer interface researcher at the Utrecht University Medical Center who did not participate in the study, published in Nature Communications. “If the new spelling system proves reliable for all people who are completely locked in, and if it can be made more efficient and affordable, it could allow them to reconnect with their families and care teams,” added Reinhold Scherer, a neural engineer at the University of Essex.
ALS destroys the nerves that control movement, and most patients die within 5 years of diagnosis. When a person can no longer speak, they can use an eye-tracking camera to select letters on a screen. Later in the progression of the disease, they can answer yes-or-no questions with subtle eye movements. But a patient who chooses to prolong life may spend months or years able to hear but not communicate, once even the eyes can no longer move.
In 2016, Vansteensel's team reported that a woman with ALS could spell sentences with a brain implant that detected her attempts to move her hand. But that patient still retained minimal control of some eye and mouth muscles. It was not clear whether a brain that has lost all control over the body can signal intended movements consistently enough to allow meaningful communication.
Revolutionary communication
The patient participating in the new study, a man with ALS who is now 36 years old, started working with a research team at the University of Tübingen (Germany) in 2018, when he could still move his eyes. He told the team that he wanted an invasive implant to try to maintain communication with his family, including his young son. His wife and sister gave their written consent to the surgery.
“Consent to this type of study carries ethical challenges. This man would not have been able to change his mind or opt out once he lost his last channel of communication through eye movements,” said Dr. Eran Klein, a neurologist and neuroethicist at the University of Washington, Seattle.
The researchers inserted two square electrode arrays, each 3.2 millimeters wide, into a part of the brain that controls movement. When the man was asked to try to move his hands, feet, head and eyes, the neural signals were not consistent enough to answer yes-or-no questions, said Ujwal Chaudhary, a biomedical engineer and neurotechnologist at the German non-profit organization ALS Voice.
After nearly 3 months of unsuccessful efforts, the team tested neurofeedback, in which a person tries to modify their brain signals while getting a real-time measurement of whether they are succeeding. An audible tone became higher as the electrical firing of neurons near the implant accelerated, and lower as it slowed down.
The researchers asked the participant to change that tone using any strategy. On the first day, he could move the tone, and by day 12 he could match it to a target tone. “It was like music to the ear,” Chaudhary recalls. The researchers fine-tuned the system by searching for the most responsive neurons and determining how each one changed with the participant's efforts.
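As a rough illustration of the feedback described above, the loop can be sketched as a mapping from measured neural firing rate to tone frequency. The rate and pitch ranges below are illustrative assumptions, not the study's actual parameters:

```python
# Hypothetical sketch of the neurofeedback loop: faster firing of
# neurons near the implant raises the pitch of an audible tone,
# slower firing lowers it. All numbers here are assumptions for
# illustration, not values from the study.

def firing_rate_to_pitch(rate_hz, rate_min=5.0, rate_max=50.0,
                         pitch_min=120.0, pitch_max=480.0):
    """Map a neural firing rate (spikes/s) to a tone frequency (Hz)."""
    # Clamp the rate into the expected range, then interpolate linearly.
    rate = max(rate_min, min(rate_max, rate_hz))
    fraction = (rate - rate_min) / (rate_max - rate_min)
    return pitch_min + fraction * (pitch_max - pitch_min)

# A rising firing rate produces a rising tone the participant can hear:
for rate in (5, 20, 35, 50):
    print(round(firing_rate_to_pitch(rate)))  # 120, 240, 360, 480
```

In a real system the rate would come from spike counts decoded in short time windows, and the participant hears the resulting tone in real time, which is what makes the feedback usable as a control signal.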
By holding the pitch high or low, the man could indicate “yes” or “no” to groups of letters and then to individual letters. After about 3 weeks with the system, he produced his first intelligible sentence: a request for caregivers to change his position. Over the following year, he produced dozens of sentences at a painstaking pace of about one character per minute: “Goulash soup and sweet pea soup.” “I would like to listen to the album by Tool loud.” “I love my cool son.”
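The selection scheme described above, narrowing from groups of letters down to a single letter with yes/no answers, amounts to a binary search over the alphabet. The sketch below is a hypothetical illustration of that idea; the study's actual letter groupings and timing differ:

```python
# Hypothetical sketch of the yes/no spelling scheme: the candidate set
# is repeatedly split in half, and each "yes" (high tone) or "no"
# (low tone) answer discards half of the remaining letters.
# This illustrates the principle, not the study's actual protocol.

ALPHABET = list("abcdefghijklmnopqrstuvwxyz ")

def spell_letter(answer_yes_no):
    """Narrow down to one character using yes/no answers.

    answer_yes_no(group) -> True if the intended letter is in `group`.
    """
    candidates = ALPHABET
    while len(candidates) > 1:
        half = candidates[: len(candidates) // 2]
        if answer_yes_no(half):   # "yes": keep the first half
            candidates = half
        else:                     # "no": keep the second half
            candidates = candidates[len(half):]
    return candidates[0]

# Simulate a user whose intended letter is "s": they answer "yes"
# whenever the offered group contains "s".
target = "s"
print(spell_letter(lambda group: target in group))  # prints "s"
```

With 27 candidates, each letter takes about five yes/no answers, which helps explain why spelling proceeded at roughly one character per minute.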
He eventually explained to the team that he modulated the tone by trying to move his eyes. But he did not always succeed. On only 107 of the 135 days reported in the study could he match a series of target tones with 80% accuracy, and on only 44 of those 107 days could he produce an intelligible sentence.
“We can only speculate about what happened on the other days. The participant may have been asleep or simply not in the mood. Perhaps the signal from the brain was too weak or too variable to optimally calibrate the computer's decoding system, which had to be recalibrated daily. The relevant neurons may also have drifted in and out of range of the electrodes,” says co-author Jonas Zimmermann, a neuroscientist at the Wyss Center for Bio and Neuroengineering in Switzerland.
Even so, the study shows that it is possible to maintain communication with a person even as they become locked in, by adapting an interface to their remaining abilities, says Melanie Fried-Oken, who studies brain-computer interfaces at Oregon Health & Science University. “It's so good,” she says. But hundreds of hours were spent designing, testing and maintaining the custom system, she adds: “We are nowhere near turning this into an assistive technology that a family could buy.”
The demonstration also raises ethical questions. Discussing end-of-life care preferences is difficult enough for people who can speak, Klein notes. “Can you have one of those really complicated conversations with one of these devices that only allows you to say three sentences a day? You certainly don't want to misunderstand a word here or a word there.” Zimmermann says the research team stipulated that the participant's healthcare should not depend on the interface. “If the output of the speller were ‘turn off my ventilator’, we wouldn't do it. But it is up to the family members to interpret the patient's wishes as they see fit,” he clarified.
Chaudhary's foundation is seeking funding to provide similar implants to several other people with ALS. He estimates that the system would cost about $500,000 over the first 2 years. Meanwhile, Zimmermann and his colleagues are developing a signal-processing device that attaches to the head with magnets instead of being anchored through the skin, which carries a risk of infection.
Until now, devices that read signals from outside the skull have not enabled spelling. In 2017, a team announced that it could classify with 70% accuracy the yes-or-no brain responses of a completely locked-in participant using a non-invasive technology called functional near-infrared spectroscopy (fNIRS). Two co-authors of the new study, Chaudhary and University of Tübingen neuroscientist Niels Birbaumer, were part of that team. But other researchers raised concerns about the study's statistical analysis. Two investigations found misconduct in 2019, and two papers were retracted. The authors filed a lawsuit to challenge the findings of misconduct. Scherer, who was skeptical of the fNIRS study, says the results with the invasive device are “definitely stronger”.
Researchers at the Wyss Center continue to work with this study participant, but his ability to spell has diminished and he now answers mostly yes or no questions, Zimmermann says.
Scar tissue around the implant is partly to blame, because it obscures neural signals, Zimmermann says. Cognitive factors could also play a role: the participant's brain may be losing the ability to control the device after years of being unable to affect its environment. But the research team is committed to maintaining the device as long as he continues to use it. “There is this enormous responsibility. We are very aware of that,” he concluded.