Dryad

Data from: Decoding intended speech with an intracortical brain-computer interface in a person with longstanding anarthria and locked-in syndrome

Data files

Mar 04, 2026 version (18.98 GB)


Abstract

Intracortical brain-computer interfaces (iBCIs) for decoding intended speech have provided individuals with ALS and severe dysarthria an intuitive method for high-throughput communication. These advances have been demonstrated in individuals who are still able to vocalize and move their speech articulators. Here, we decoded intended speech from an individual with longstanding anarthria, locked-in syndrome, and ventilator dependence due to advanced symptoms of ALS. We found that phonemes, words, and higher-order language units could be decoded well above chance. While sentence decoding accuracy was below that of prior demonstrations in participants with dysarthria, we obtained an extensive characterization of the neural signals underlying speech in a person with locked-in syndrome and, through our results, identify several directions for future improvement. These include closed-loop speech imagery training and decoding linguistic (rather than phonemic) units from neural signals in the middle precentral gyrus. Overall, these results demonstrate that speech decoding from the motor cortex may be feasible in people with anarthria and ventilator dependence. For individuals with longstanding anarthria, a purely phoneme-based decoding approach may lack the accuracy necessary to support independent use as a primary means of communication; however, additional linguistic information embedded within neural signals may provide a route to augment the performance of speech decoders.