New York, May 1 (IANS): Imagine a brain-machine interface that does not just figure out what sounds you want to make, but what you want to say.
Researchers have taken a step in that direction by building a "semantic atlas" that shows in vivid colours how the brain organises different words by their meanings.
The atlas identifies brain areas that respond to words that have similar meanings.
Such detailed semantic maps could eventually help give voice to people who cannot speak, such as those affected by stroke, brain damage or motor neuron diseases such as ALS.
While mind-reading technology remains far off, charting how language is organised in the brain brings decoding inner dialogue a step closer to reality, the researchers said.
"This discovery paves the way for brain-machine interfaces that can interpret the meaning of what people want to express," said study lead author Alex Huth, postdoctoral researcher in neuroscience at University of California, Berkeley.
For example, clinicians could track the brain activity of patients who have difficulty communicating and then match that data to semantic language maps to determine what their patients are trying to express.
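In rough terms, such a decoder could work like a nearest-neighbour lookup: compare an observed pattern of brain activity against stored response patterns for candidate words and pick the closest match. The sketch below illustrates that idea with made-up numbers and a hypothetical word list; it is a simplified illustration, not the study's actual method.

```python
import numpy as np

# Hypothetical semantic map: each candidate word is paired with a
# per-voxel response pattern (here, just three voxels, invented values)
# that would in practice be learned from earlier imaging sessions.
semantic_map = {
    "water":  np.array([0.8, 0.1, 0.3]),
    "mother": np.array([0.2, 0.9, 0.4]),
    "pain":   np.array([0.1, 0.3, 0.9]),
}

def decode(activity: np.ndarray) -> str:
    """Return the candidate word whose stored response pattern is most
    similar (by cosine similarity) to the observed brain activity."""
    def cosine(a, b):
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(semantic_map, key=lambda word: cosine(activity, semantic_map[word]))

# Example: a newly observed pattern of voxel activity from a patient.
observed = np.array([0.75, 0.15, 0.35])
print(decode(observed))  # -> "water"
```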
Another potential application is a decoder that translates what you say into another language as you speak.
The findings, published in the journal Nature, are based on a brain imaging study that recorded neural activity while study volunteers listened to stories from "The Moth Radio Hour" - a public radio show in which people recount humorous and poignant autobiographical experiences.
The results showed that at least one third of the brain's cerebral cortex -- including areas dedicated to high-level cognition -- is involved in language processing.
Notably, the study found that different people share similar language maps.
"The similarity in semantic topography across different subjects is really surprising," Huth said.