Breakthroughs in Bilingual Brain Implantation

Bilingual Brain Implants

Since the advent of smaller, faster, more powerful microchips and the emergence of Artificial Intelligence, or AI, neuroscientists have made impressive strides in restoring movement, speech and touch in paraplegic and quadriplegic patients who previously had little hope of regaining even a semblance of their once active lives. In most cases, micro-electrode arrays are surgically implanted over the patient's sensory cortex, the region of the brain responsible for tactile sensations such as pressure and touch, or over speech-related regions such as Broca's area of the frontal lobe, where speech commands are produced. Patients are then connected to a brain-machine interface, or BMI, which lets them control neuroprosthetic limbs, exoskeletons and even their own arms and legs, with the goal of restoring mobility and verbal speech.
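As a rough illustration of that pipeline, the short Python sketch below walks a signal from an implanted electrode array through feature extraction and decoding to an effector command. Every function and threshold here is a hypothetical placeholder, not the design of any clinical system; real BMIs depend on trained decoders and medical-grade signal processing.

import numpy as np

def read_electrode_array(num_channels=128):
    """Stand-in for sampling voltages from an implanted micro-electrode array."""
    return np.random.randn(num_channels)

def extract_features(raw_signal):
    """Reduce raw voltages to a simple feature vector (toy stand-in for band power, spike counts, etc.)."""
    return np.abs(raw_signal)

def decode_intent(features):
    """Map features to an intended command; a real BMI would use a trained neural decoder."""
    return "move_left" if features.mean() < 0.8 else "move_right"

def drive_effector(command):
    """Send the decoded command to a prosthetic limb, exoskeleton, or speech synthesizer."""
    print(f"effector command: {command}")

if __name__ == "__main__":
    signal = read_electrode_array()
    features = extract_features(signal)
    drive_effector(decode_intent(features))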

The Breakthrough

In a groundbreaking study published in the New England Journal of Medicine in 2021, neurosurgeon Edward Chang of UC San Francisco Medical Center surgically implanted electrodes in a patient nicknamed Pancho, who had suffered a stroke at age twenty that left him paralyzed over much of his body and reduced his only audible communication to grunts and groans. Pancho was in his thirties at the time of his surgery, and the implants recorded his neural activity and translated his thoughts into words on a screen. His first words, "My family is outside," were decoded in English, a language he learned only after his stroke; since his native language was Spanish, the empathetic research team at UCSF sought to restore his connection to his native tongue and the identity bound up with it. To achieve that goal, the team developed an AI system to decipher Pancho's thoughts in both English and Spanish.

Competing AI Modules

The system was refined by Alexander Silva, a researcher in Chang's lab, who trained it on 200 words Pancho attempted to say, each word adding a distinct neural pattern to the data set recorded from his implanted electrodes and fed into two separate AI modules, one for English and one for Spanish. The dueling modules then built phrases by comparing and refining results from both languages. Decoding accuracy proved to be 88% at determining the intended language from the first word and 75% for full sentences and phrases, at speeds of up to 78 words per minute. The advance has further inspired the group to transmit the decoded neural signals to a facially responsive avatar that generates audible synthetic speech, making breakthroughs in bilingual brain implantation a bold new hope for speech-limited patients everywhere.
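To make the idea of competing language modules concrete, the minimal Python sketch below shows how two decoders might each propose a transcript from the same stream of neural features, with the system keeping whichever language hypothesis accumulates the higher confidence. The class names, toy vocabularies and random confidences are illustrative placeholders, not the UCSF team's actual models or methods.

import random
from dataclasses import dataclass

@dataclass
class Hypothesis:
    language: str
    words: list
    probability: float  # running confidence in this language's transcript

class ToyLanguageDecoder:
    """Stand-in for one AI module (English or Spanish) trained on neural recordings."""
    def __init__(self, language, vocabulary):
        self.language = language
        self.vocabulary = vocabulary

    def decode_word(self, neural_features):
        # A real module would map electrode activity to a word;
        # here we just pick a vocabulary word with a mock confidence.
        word = random.choice(self.vocabulary)
        confidence = random.uniform(0.5, 1.0)
        return word, confidence

def decode_phrase(neural_feature_stream, decoders):
    """Run both language modules on the same features and keep the likelier transcript."""
    hypotheses = [Hypothesis(d.language, [], 1.0) for d in decoders]
    for features in neural_feature_stream:
        for hyp, decoder in zip(hypotheses, decoders):
            word, confidence = decoder.decode_word(features)
            hyp.words.append(word)
            hyp.probability *= confidence  # accumulate confidence per language
    return max(hypotheses, key=lambda h: h.probability)

if __name__ == "__main__":
    english = ToyLanguageDecoder("English", ["my", "family", "is", "outside"])
    spanish = ToyLanguageDecoder("Spanish", ["mi", "familia", "está", "afuera"])
    # Four fake "neural feature" frames, one per attempted word.
    best = decode_phrase([None] * 4, [english, spanish])
    print(best.language, " ".join(best.words), round(best.probability, 2))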