Mia's Feed
Medical News & Research

Innovative Brain-Computer Interface Enables Speech Decoding and Computer Control for ALS Patients


Researchers at the University of California, Davis have developed a brain-computer interface (BCI) that can both control a computer cursor and decode speech directly from neural signals. The advance is particularly significant for people with amyotrophic lateral sclerosis (ALS), a progressive neurological disease that causes paralysis and speech impairment while leaving cognitive function intact.

The study involved a single participant, a 45-year-old man with ALS who has paralysis and difficulty speaking. Four electrode arrays, each with 64 contacts, were surgically implanted in the ventral precentral gyrus, a brain area involved in speech production, with placement guided by advanced MRI techniques. The implants recorded neural signals at high sampling rates, which were processed in real time to decode the user's intentions.
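Real-time decoding of this kind typically bins threshold-crossing (spike) counts from each electrode, smooths them, and maps them through a learned decoder to an output such as cursor velocity. The following is a minimal, hypothetical sketch of one such decoding step; the bin width, smoothing factor, and decoder weights are illustrative placeholders, not the study's fitted parameters.

```python
import random

N_CHANNELS = 256   # 4 implanted arrays x 64 contacts each
ALPHA = 0.9        # exponential smoothing factor (illustrative)

random.seed(0)
# Placeholder linear decoder: one (x, y) weight pair per channel.
weights = [(random.uniform(-1, 1), random.uniform(-1, 1))
           for _ in range(N_CHANNELS)]

def decode_velocity(bin_counts, smoothed, weights, alpha=ALPHA):
    """One decoding step: smooth binned spike counts per channel,
    then map the smoothed rates to a 2-D cursor velocity."""
    vx = vy = 0.0
    for i, count in enumerate(bin_counts):
        smoothed[i] = alpha * smoothed[i] + (1 - alpha) * count
        wx, wy = weights[i]
        vx += wx * smoothed[i]
        vy += wy * smoothed[i]
    return (vx, vy), smoothed

# Simulated 20 ms bin of spike counts from all channels.
smoothed = [0.0] * N_CHANNELS
counts = [random.randint(0, 3) for _ in range(N_CHANNELS)]
(vx, vy), smoothed = decode_velocity(counts, smoothed, weights)
```

In a running system this step would repeat every bin, with the decoder weights calibrated from sessions in which the participant attempts cued movements.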

The BCI system was tested on a series of tasks, including calibration sessions, grid-based cursor evaluations, and simultaneous speech and cursor control exercises. Using machine learning decoders, the system interpreted neural activity to drive cursor movement and clicks with high accuracy, achieving throughput of up to 3.16 bits per second. The participant was able to operate a personal computer independently and enter text, demonstrating the system's practical usability in everyday settings.
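Bits per second is a standard throughput metric for cursor-control BCIs. As a rough illustration of how such a figure is computed, here is the widely used Wolpaw information-transfer-rate formula; the grid size, accuracy, and selection rate below are illustrative assumptions, not values reported in the study.

```python
import math

def itr_bits_per_selection(n_targets, accuracy):
    """Wolpaw information transfer rate: bits conveyed per selection,
    given the number of possible targets and the selection accuracy."""
    n, p = n_targets, accuracy
    if p >= 1.0:
        return math.log2(n)  # perfect accuracy: full log2(N) bits
    if p <= 0.0:
        raise ValueError("accuracy must be positive")
    return (math.log2(n)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n - 1)))

def itr_bits_per_second(n_targets, accuracy, selections_per_minute):
    """Scale per-selection ITR by the selection rate."""
    return itr_bits_per_selection(n_targets, accuracy) * selections_per_minute / 60.0

# Illustrative only: a 40-target grid at 95% accuracy, 40 selections/minute.
rate = itr_bits_per_second(40, 0.95, 40)
```

Under these assumed parameters the formula yields a rate on the order of a few bits per second, the same regime as the performance reported here.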

Importantly, the study revealed that neural signals from the speech motor cortex could support both communication via speech decoding and computer navigation through a single implant site. While simultaneous speech and cursor control increased target acquisition times, these findings underscore the potential for refined decoder algorithms to improve future system performance.

This research represents a significant step toward integrated neural interfaces that can restore communication and computer control for those with severe motor impairments. The success of using a single brain region to support multiple functions opens new avenues for developing multi-modal BCIs, promising to enhance independence and quality of life for patients with ALS and similar neurological conditions.

Published in the Journal of Neural Engineering, this study demonstrates the feasibility of multi-functional neural interfaces, providing hope for more accessible and versatile assistive technologies in the near future.

source: https://medicalxpress.com/news/2025-05-brain-interface-speech-decoding-als.html
