For some patients, the ‘inner voice’ may soon be audible



For decades, neuro-engineers have dreamed of helping people who have been cut off from the world of language. A disease like amyotrophic lateral sclerosis, or ALS, weakens the muscles in the airway. A stroke can kill neurons that normally relay commands for speaking. Perhaps, by implanting electrodes, scientists could record the brain’s electrical activity and translate it into spoken words.

Now a team of researchers has made an important advance toward that goal. Previously they succeeded in decoding the signals produced when people tried to speak. In the new study, published Thursday in the journal Cell, their computer often made correct guesses when the subjects simply imagined saying words.

Christian Herff, a neuroscientist at Maastricht University in the Netherlands who was not involved in the research, said the result went beyond the merely technological and shed light on the mystery of language. “It’s a fantastic advance,” Herff said.

The new study is the latest result in a long-running clinical trial, called BrainGate2, that has already seen some remarkable successes. One participant, Casey Harrell, now uses his brain-machine interface to hold conversations. In 2023, after ALS had made his voice unintelligible, Harrell agreed to have electrodes implanted in his brain. A computer recorded the electrical activity from the implants as Harrell attempted to say different words. Over time, with the help of AI, the computer learned to predict 6,000 words with 97.5% accuracy.

But successes like this raised a troubling question: Could a computer accidentally record more than patients actually wanted to say? Could it eavesdrop on their inner voice?

“We wanted to investigate if there was a risk of the system decoding words that weren’t meant to be said aloud,” said Erin Kunz, a neuroscientist at Stanford University and an author of the study. She and her colleagues also wondered if patients might actually prefer using inner speech.

Kunz and her colleagues decided to investigate the mystery for themselves. The scientists gave participants seven different words, including “kite” and “day,” then compared the brain signals recorded when participants attempted to say the words with those recorded when they only imagined saying them.

As it turned out, imagining a word produced a pattern of activity similar to that of trying to say it, but the signal was weaker. The computer did a good job of predicting which of the seven words the participants were thinking. For Harrell, it did not do much better than a random guess would have, but for another participant it picked the right word more than 70% of the time.

The researchers then put the computer through more training, this time specifically on inner speech. Its performance improved significantly, including for Harrell. Now when the participants imagined saying entire sentences, such as “I don’t know how long you’ve been here,” the computer could accurately decode most of the words.

Herff, who has done his own studies, was surprised that the experiment succeeded. Before, he would have said that inner speech is fundamentally different from the motor cortex signals that produce actual speech. “But in this study, they show that, for some people, it isn’t that different,” he said.

Kunz emphasized that the computer’s current performance on inner speech would not be good enough to let people hold conversations. “The results are an initial proof of concept more than anything,” she said. But she is optimistic that decoding inner speech could become the new standard for brain-computer interfaces.
In recent trials, she and her colleagues have improved the computer’s accuracy. “We haven’t hit the ceiling yet,” she said. NYT
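
To make the decoding idea concrete, here is a minimal sketch in Python of how such a comparison might work. Everything in it is assumed for illustration: the synthetic 64-channel “recordings,” the five filler words beyond “kite” and “day,” and the simple logistic-regression classifier all stand in for the study’s actual implants and AI model, which the article does not describe in detail. The sketch only reproduces the qualitative finding: imagined words evoke weaker but related patterns, and a decoder improves once it is trained on inner speech itself.

# Hypothetical sketch only. All "neural data" below is synthetic;
# the channel count, trial count, and classifier are assumptions,
# not the study's actual methods.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Only "kite" and "day" are named in the article; the rest are placeholders.
WORDS = ["kite", "day", "up", "cowboy", "river", "swim", "telephone"]
N_CHANNELS = 64   # assumed number of recording electrodes
TRIALS = 40       # assumed trials per word

# One fixed "template" activity pattern per attempted word.
attempt_templates = rng.normal(0.0, 1.0, size=(len(WORDS), N_CHANNELS))

# Imagined speech: a much weaker copy of the attempted pattern plus a small
# word-specific difference, mimicking the finding that the signals are
# similar but weaker (the mixing weights are arbitrary).
inner_templates = (0.25 * attempt_templates
                   + 0.35 * rng.normal(0.0, 1.0, size=attempt_templates.shape))

def simulate(templates, noise=1.0):
    """Generate noisy trials around each word's template pattern."""
    X = (np.repeat(templates, TRIALS, axis=0)
         + rng.normal(0.0, noise, size=(len(WORDS) * TRIALS, N_CHANNELS)))
    y = np.repeat(np.arange(len(WORDS)), TRIALS)
    return X, y

X_att, y_att = simulate(attempt_templates)
X_inn, y_inn = simulate(inner_templates)

# Shuffle the inner-speech trials so a train/test split covers all words.
perm = rng.permutation(len(X_inn))
X_inn, y_inn = X_inn[perm], y_inn[perm]

# A decoder trained only on attempted speech, tested on imagined speech...
att_clf = LogisticRegression(max_iter=2000).fit(X_att, y_att)
print("attempt-trained, tested on inner speech:", att_clf.score(X_inn, y_inn))

# ...versus a decoder trained on inner speech itself, mirroring the extra
# round of training the researchers performed.
half = len(X_inn) // 2
inn_clf = LogisticRegression(max_iter=2000).fit(X_inn[:half], y_inn[:half])
print("inner-trained, tested on inner speech: ", inn_clf.score(X_inn[half:], y_inn[half:]))

In this toy setup, the attempt-trained decoder should score above chance (one in seven) on imagined words but noticeably below the decoder trained on inner speech, echoing the improvement the researchers saw after training specifically on inner speech.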

