Scientists have taken a surprising step toward decoding thoughts, demonstrating that a computer connected to electrodes implanted in the brain can interpret not only the words a person speaks or tries to say, but also words that are merely thought, with no intention of being spoken.
For decades, scientists have dreamed of restoring the voices of people who have lost the ability to speak, such as patients with amyotrophic lateral sclerosis (ALS) or with brain injury following a stroke.
The idea is simple but ambitious: if the muscles can no longer produce speech, can the brain be “read” directly and thoughts transformed into words?
Now, a team of researchers has taken an important step towards this goal. Previously, they had managed to decode the signals produced when people were trying to speak, News reports.
In the new study, published on Thursday in the journal Cell, the computer often guessed the words that the subjects were thinking. The test is part of a larger, long-term study conducted at UC Davis Health, called BrainGate2, which has already had remarkable successes.
Scientists implanted tiny electrodes in the brains of volunteers, in an area called the motor cortex, which is responsible for sending commands to the muscles involved in speech.
A computer connected to these electrodes recorded the electrical signals generated when the volunteers tried to say certain words. Using artificial intelligence (AI), the system learned to recognize almost 6,000 words with an accuracy of over 97%, and even to reproduce them in the patient’s voice.
The novelty of the recent study is that the system can also correctly “guess” words that the patient only thinks, with no intention of saying them. Essentially, when we think of a word or a sentence, the brain produces an activity pattern similar to the one created when we try to speak, only weaker.
After additional training, the computer managed to decode even whole sentences imagined by the volunteers, not just isolated words.
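To see why a “similar but weaker” pattern can still be decoded, consider a toy illustration: if patterns are compared by their direction rather than their overall strength, a fainter copy of the same pattern still matches the same word. The sketch below is a hypothetical simplification, not the study’s actual decoder; the feature vectors and word list are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
words = ["kite", "day", "hello"]
# Hypothetical "attempted speech" pattern for each word: one feature vector.
attempted = {w: rng.normal(size=8) for w in words}

def cosine(a, b):
    # Similarity of direction, independent of overall signal strength.
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def decode(pattern):
    # Pick the word whose attempted-speech pattern points the same way.
    return max(words, key=lambda w: cosine(pattern, attempted[w]))

# "Inner speech": the same pattern at a fraction of the strength, plus noise.
imagined_kite = 0.3 * attempted["kite"] + rng.normal(scale=0.05, size=8)
print(decode(imagined_kite))  # "kite": the weaker pattern still matches
```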
This could be extremely useful for patients who tire quickly when trying to speak, because it would eliminate the physical effort. At the same time, it raises concerns about the privacy of thought: if the technology can access what we think, how do we make sure that only what we want to transmit is decoded?
The results show that the technology works, but it is still at the experimental stage. If perfected, it could change the lives of many people, but it must be accompanied by clear rules to protect the privacy of our thoughts.
Christian Herff, a neuroscientist at Maastricht University in the Netherlands who was not involved in the research, told the New York Times that the result goes beyond the purely technological and sheds light on the mystery of language.
So far, the system has allowed a participant diagnosed with amyotrophic lateral sclerosis to “talk” with his family through a computer in real time, to change his intonation and to “sing” simple songs, after ALS affected his voice and made his speech unintelligible.
In 2023, he agreed to have electrodes implanted in his brain. Surgeons placed four arrays on the left side, in the motor cortex, where the brain issues the commands that drive the muscles to produce speech.
A computer recorded the electrical activity while the man tried to speak various words. With the help of AI, the computer predicted almost 6,000 words correctly, with an accuracy of 97.5%, and then synthesized them in his voice, based on recordings made before the disease.
But this success raised a question: could a computer record more than what the patient wants to say? Could it “listen” to the inner voice?
Dr. Erin Kunz, a neuroscientist at Stanford University, and her team wanted to find out whether there was a risk that the system would decode words that were not meant to be said. They also wondered whether patients would prefer to use inner speech: they had noticed that participants grew tired when trying to speak, and merely imagining sentences could be easier and could speed up the process.
It was not clear whether inner speech could be decoded at all and, until now, scientists have not fully agreed on how to define it.
The brain produces language using several interconnected regions. Signals in the language network can be used to command the muscles, for speech or sign language, but many people also have the feeling that they use language to think, hearing their thoughts as an inner voice.
Some researchers claim that language is essential for thinking. Others, based on recent studies, believe that much of our thinking does not involve language at all, and that people who hear an inner voice are in fact perceiving a kind of sporadic commentary in their minds.
Kunz and her colleagues gave the participants seven words, such as “kite” and “day”, and compared the brain signals recorded when they spoke the words with those recorded when they merely thought them.
Imagining a word produced a similar but weaker pattern. The computer predicted the imagined word fairly well, and after training specifically on inner speech the accuracy increased, so that entire sentences could be decoded correctly.
The researchers were surprised, because they had believed that inner speech was fundamentally different from the signals of the motor cortex, but the study showed that, for some people, the difference is not large.
Dr. Kunz said that the performance achieved so far is not sufficient for real conversations, but she is optimistic that inner speech could become the standard for brain-computer interfaces and that accuracy and speed will increase, as recent studies suggest.
As for mental privacy, there have been cases in which the system detected words that the participants did not intend to speak.
In one experiment, participants were shown a screen filled with 100 pink and green rectangles and circles. They then had to determine the number of shapes of a certain color, for example, green circles.
As the participants solved the problem, the computer sometimes decoded the word corresponding to a number. In effect, the participants were counting the shapes, and the computer “heard” them.
This suggests that language could be involved in many forms of thought.
To prevent private thoughts from being intercepted, the team proposed two solutions: either set the system to decode only attempted speech, not inner speech, or have decoding activate only after a “mental password”, an unusual phrase chosen deliberately.
The researchers chose the phrase “Chitty Chitty Bang Bang”, and a participant managed to use it with an accuracy of 98.75%, so that decoding began only after the password was recognized.
This approach could give patients greater control over the information they share.
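As a thought experiment, here is a minimal sketch of how such a gate might work, assuming a decoder that emits one word at a time; the class, the buffer logic, and the word stream are illustrative inventions, not the study’s actual software.

```python
from typing import List, Optional

class GatedDecoder:
    """Suppresses decoded words until a chosen passphrase is recognized."""

    def __init__(self, passphrase: str) -> None:
        self.passphrase = passphrase.lower().split()
        self.recent: List[str] = []  # last few words, checked for the passphrase
        self.unlocked = False

    def feed(self, decoded_word: str) -> Optional[str]:
        """Receive one word from the (hypothetical) neural decoder."""
        if self.unlocked:
            return decoded_word  # gate is open: the word may be shared
        self.recent.append(decoded_word.lower())
        self.recent = self.recent[-len(self.passphrase):]  # sliding window
        if self.recent == self.passphrase:
            self.unlocked = True  # passphrase recognized: open the gate
        return None  # nothing is shared before the password

# Words decoded before the passphrase stay private; words after pass through.
gate = GatedDecoder("chitty chitty bang bang")
stream = ["seven", "green", "chitty", "chitty", "bang", "bang", "i", "am", "tired"]
print([w for w in stream if gate.feed(w)])  # ['i', 'am', 'tired']
```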
However, some experts are skeptical that an implant could capture much of our spontaneous thought, since thoughts are not usually well-formed sentences.
Although this study is more a proof of concept than a technology ready for use, it suggests that decoding thoughts is possible and that it could become a viable option for people who cannot use their voice.
The success, however, also brings major ethical challenges to the forefront, such as protecting the privacy of thought, which means that technological progress must be accompanied by rigorous regulations and safety guarantees.