Dryad

Data for: A high-performance speech neuroprosthesis

Abstract

Brain-computer interfaces (BCIs) can restore communication to people who have lost the ability to move or speak. In this study, we demonstrated an intracortical BCI that decodes attempted speaking movements from neural activity in motor cortex and translates them into text in real time, using a recurrent neural network decoding approach. With this BCI, our study participant, who can no longer speak intelligibly due to amyotrophic lateral sclerosis, achieved a 9.1% word error rate on a 50-word vocabulary and a 23.8% word error rate on a 125,000-word vocabulary.
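The word error rates above are the standard speech-recognition metric: the word-level edit distance between the decoded sentence and the reference sentence, divided by the number of reference words. A minimal sketch of that computation (illustrative only; the study's exact scoring pipeline may differ):

```python
def word_error_rate(reference, hypothesis):
    """Word-level edit distance (substitutions + insertions + deletions)
    divided by the number of words in the reference sentence."""
    ref = reference.split()
    hyp = hypothesis.split()
    # d[i][j] = edit distance between ref[:i] and hyp[:j]
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + sub)  # match or substitution
    return d[len(ref)][len(hyp)] / len(ref)
```

For example, decoding "the quack brown" against the reference "the quick brown fox" counts one substitution and one deletion, giving a WER of 2/4 = 0.5.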

This dataset contains all of the neural activity recorded during these experiments, consisting of 12,100 spoken sentences as well as instructed delay experiments designed to investigate the neural representation of orofacial movement and speech production.

The data have also been formatted for developing and evaluating machine learning decoding methods, and we intend to host a decoding competition. To this end, the dataset also includes files for reproducing our offline decoding results, including a language model and an example RNN decoder.
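Decoders of this kind typically emit a label per time step of neural activity, which is then collapsed into a phoneme or word sequence before language-model rescoring. A minimal CTC-style collapse step, as a sketch (the blank symbol "_" and the per-frame label format are assumptions for illustration, not the dataset's actual format; see the linked repository for the real decoder):

```python
def collapse_ctc(frame_labels, blank="_"):
    """Collapse a per-frame label sequence CTC-style:
    merge consecutive repeats, then drop blank symbols."""
    out = []
    prev = None
    for label in frame_labels:
        if label != prev and label != blank:
            out.append(label)
        prev = label  # track previous frame to merge repeats
    return out
```

For instance, the frame sequence `["_", "h", "h", "_", "e", "l", "l", "_", "l", "o"]` collapses to `["h", "e", "l", "l", "o"]`; the blank between the two "l" runs is what keeps the repeated phoneme from being merged away.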

Code associated with the data can be found here: https://github.com/fwillett/speechBCI.