It may sound too fictional to imagine, but scientists in the U.S. have already developed a system that can translate brain activity into text, bringing that scenario closer to reality.
David A. Moses, Joseph G. Makin, and Edward F. Chang developed a system that decodes the electrocorticogram with high accuracy at natural-speech rates. Its goal is to generate written text from decoded brain activity.
The researchers trained a network to encode sentence-length sequences of neural activity into an abstract representation, which is then decoded word by word into an English sentence.
The system currently works on the neural patterns of people as they speak aloud. In time, however, it could also help people who are unable to talk or move communicate, such as those paralyzed by locked-in syndrome.
System Check: The Process of Machine Translation as Told by Makin
Given the benefits above, the study's co-author, Dr. Joseph Makin, noted that the A.I. could serve as the basis for a speech prosthesis. To examine its effectiveness, Makin and his team developed the system as follows:
- They recruited four participants who already had electrode arrays implanted in their brains to monitor epileptic seizures.
- Each participant read a set of 30-50 sentences aloud multiple times. During this time, the team recorded their brain activity as they spoke.
- The research team fed the data into a machine-learning algorithm that transforms the neural activity recorded for each spoken sentence into a string of numbers.
- The second part of the system then converts that string of numbers into a sequence of words.
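The two-stage pipeline described above can be sketched in code. The toy below is only an illustration of the idea, not the researchers' actual method: the real system used recurrent neural networks trained on ECoG recordings, whereas here the "encoder" is simple averaging and the "decoder" is a nearest-neighbor lookup. All names (`VOCAB`, `encode`, `decode`, the array shapes) are hypothetical, and the data is random noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vocabulary; the real study drew words from its 30-50 sentences.
VOCAB = ["she", "had", "your", "dark", "suit"]

def encode(recording):
    """Stage 1 (toy stand-in for the encoder network): compress a
    variable-length (timesteps x channels) recording into one
    fixed-length string of numbers by averaging over time."""
    return recording.mean(axis=0)

def decode(state, word_embeddings):
    """Stage 2 (toy stand-in for the decoder network): emit the vocabulary
    word whose embedding vector is closest to the encoded state."""
    distances = np.linalg.norm(word_embeddings - state, axis=1)
    return VOCAB[int(distances.argmin())]

# Simulated "neural activity": 100 timesteps from 8 electrode channels.
recording = rng.normal(size=(100, 8))
word_embeddings = rng.normal(size=(len(VOCAB), 8))

# Pretend each fifth of the recording carries one word, and decode word by word.
segments = np.array_split(recording, 5)
sentence = [decode(encode(seg), word_embeddings) for seg in segments]
print(" ".join(sentence))
```

The key design idea this preserves is the split into two stages: a numeric intermediate representation sits between the raw brain signal and the output words, which is what lets the second stage be trained like a machine-translation decoder.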
The system compared the actual recordings with the sounds predicted from small pieces of brain activity. Initially, it produced sentences that made no sense, but the output improved as it continued to compare each predicted pattern of words against the 50 sentences actually read aloud.
The results also showed how strongly the numeric representations are tied to individual words. Makin and his team, based at the University of California, San Francisco, published the research in the journal Nature Neuroscience.
They mentioned some limitations of the project:
- The system is not perfect; many words were decoded incorrectly.
- The A.I. only handles a limited vocabulary drawn from 50 sentences. Going beyond that makes the decoding worse.
- Because the system requires implanted brain electrodes, it raises ethical issues.
- Decoding imagined speech, the inner voice, is a different and harder problem than decoding speech spoken aloud.
However, the system's word error rate is lower than that of professional human transcribers (3% vs. 5%). Dr. Christian Herff, an expert in the field, added that training required less than 40 minutes of data per participant, far less than the many hours such systems typically need.
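The word error rate quoted above is a standard speech-recognition metric: the minimum number of word substitutions, insertions, and deletions needed to turn the predicted sentence into the reference sentence, divided by the reference length. A minimal sketch of how it is computed (the function name and examples are illustrative, not from the study):

```python
def wer(reference, hypothesis):
    """Word error rate: word-level edit (Levenshtein) distance between the
    reference and hypothesis sentences, divided by the reference length."""
    r, h = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words.
    dp = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        dp[i][0] = i
    for j in range(len(h) + 1):
        dp[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution
    return dp[len(r)][len(h)] / len(r)

# One substituted word out of five gives a 20% word error rate.
print(wer("she had your dark suit", "she had the dark suit"))  # 0.2
```

A 3% rate therefore means roughly one wrong word in every 33, which is why the result was compared directly against the ~5% rate of professional transcribers.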
Though the system may seem almost perfect, note Dr. Mahnaz Arvaneh's caution that we are still very far from machines that can read our minds. On a positive note, she added that we can already start thinking about and planning for the possibilities. After all, this is more than a translation of thought, and artificial intelligence opens many opportunities.
Are you a medical professional who’d like to develop a system that can monitor human brain activity, too?
Mr. Jaycee De Guzman holds a degree in Computer Science. Machine language is his favorite among the several languages he speaks and writes fluently. As a self-taught computer scientist, he is interested in computer science, computer engineering, artificial intelligence, game development, space technology, and medical technology. He is also an entrepreneur with businesses in several niches, including digital marketing, finance, agriculture, and technology.