Scientists Improve Brain’s Decoder to Translate Thoughts into Text
anomalien.com

Researchers at the University of Texas at Austin have unveiled an updated version of a brain decoder that uses artificial intelligence to convert thoughts into text. The key advance in the new algorithm is its ability to adapt an existing decoder to a new person without hours of additional training data. In a study published on February 6 in the journal Current Biology, the team explored how a decoder trained on one group of people could be adapted to work on the brains of other participants.

This advance could have a significant impact on individuals with aphasia, a disorder that impairs communication. Previously, decoders required extensive training: participants listened to stories while inside an MRI machine, and the resulting models worked only for the individuals they were trained on. As study co-author Alexander Huth explained, people with aphasia often struggle to understand and produce speech, rendering the traditional method ineffective.

In the new study, the researchers trained a decoder on several reference participants by collecting fMRI data while they listened to 10 hours of radio broadcasts. They then developed two conversion algorithms: one using data from 70 minutes of radio listening and another using data from 70 minutes of watching Pixar short films. Using a technique called functional alignment, the team analyzed how participants' brains responded to the same audio and video clips, which allowed them to train a decoder for a target group without collecting hours of additional data.

The decoders were tested on a short story that participants had not heard before. Although prediction accuracy was higher for the control group, the results still showed a semantic connection between the predicted words and the story's text. For example, in one part of the test story, a character reflects on a job he dislikes.
A decoder trained on movie data predicted: "I worked in a job that I thought was boring. I had to take orders, and I didn't like them." Although the prediction was not word-for-word accurate, the underlying ideas were closely aligned. The researchers noted that the most intriguing result is the decoder's ability to function even without language-specific training data, which opens new possibilities for data collection. This could help individuals with aphasia express their thoughts, and it underscores the similarities in how ideas are represented in language and in visual imagery. The team's next step is to test the algorithm on participants with aphasia and to develop an interface that helps them generate the phrases they need.
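The core idea behind functional alignment described above can be sketched in a few lines: because two people watched or heard the same stimuli, their recorded brain responses are time-locked, so a simple linear map can project the new person's activity into the reference subject's voxel space, where the already-trained decoder operates. The sketch below is a minimal, hypothetical illustration with synthetic data; all names, shapes, and numbers are assumptions, not the study's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_time, n_vox_ref, n_vox_tgt = 200, 50, 60

# Simulate time-locked responses to the same shared stimuli: the target
# subject's activity is a linear mixture of the reference subject's
# activity plus noise (a toy stand-in for real fMRI recordings).
ref_resp = rng.standard_normal((n_time, n_vox_ref))
mixing = rng.standard_normal((n_vox_tgt, n_vox_ref))
tgt_resp = ref_resp @ mixing.T + 0.1 * rng.standard_normal((n_time, n_vox_tgt))

def fit_alignment(X_tgt, Y_ref, alpha=1.0):
    """Ridge regression (closed form) mapping target-brain activity
    into the reference-brain voxel space."""
    n_features = X_tgt.shape[1]
    W = np.linalg.solve(X_tgt.T @ X_tgt + alpha * np.eye(n_features),
                        X_tgt.T @ Y_ref)
    return W  # shape: (n_vox_tgt, n_vox_ref)

W = fit_alignment(tgt_resp, ref_resp)

# New target-subject data can now be projected into the reference space,
# where the decoder trained on the reference participants would be applied.
aligned = tgt_resp @ W
r = np.corrcoef(aligned.ravel(), ref_resp.ravel())[0, 1]
print(f"alignment correlation: {r:.2f}")
```

The appeal of this setup matches what the article describes: only the short shared-stimulus session (the 70 minutes of radio or Pixar clips) is needed to fit the alignment map, rather than the 10 hours of data used to train the original decoder.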