Data from: Gesture encoding in human left precentral gyrus neuronal ensembles
Data files (Jun 24, 2025 version, 1.72 GB total)
- Gesture_Encoding_processed_data.zip (1.72 GB)
- README.md (4.92 KB)
Abstract
Understanding the cortical activity patterns driving dexterous upper limb motion has the potential to benefit a broad clinical population living with limited mobility through the development of novel brain-computer interface (BCI) technology. The present study examines the activity of ensembles of motor cortical neurons recorded using microelectrode arrays in the dominant hemisphere of two BrainGate clinical trial participants with cervical spinal cord injury as they attempted to perform a set of 48 different hand gestures. Although each participant displayed a unique organization of their respective neural latent spaces, it was possible to achieve classification accuracies of ~70% for all 48 gestures (and ~90% for sets of 10). Our results show that single-unit ensemble activity recorded in a single hemisphere of human precentral gyrus has the potential to generate a wide range of gesture-related signals across both hands, providing an intuitive and diverse set of potential command signals for intracortical BCI use.
Dataset DOI: 10.5061/dryad.cvdncjtcq
Description of the data and file structure
Analysis code for: Gesture Encoding in human left precentral gyrus neuronal ensembles
These data are released with the manuscript: "Gesture encoding in human left precentral gyrus neuronal ensembles", Carlos Vargas-Irwin, Tommy Hosman, Jacob T. Gusman, Tsam Kiu Pun, John D. Simeral, Tyler Singer-Clark, Anastasia Kapitonava, Ziv Williams, Jaimie M. Henderson, and Leigh Hochberg. Communications Biology (conditionally accepted)
Biorxiv Preprint: https://doi.org/10.1101/2024.08.23.608325
Corresponding author: Carlos Vargas-Irwin, Brown University https://orcid.org/0000-0002-3526-3754
Overview
This dataset contains the processed neural activity recorded during the experiments described in the manuscript, comprising a total of 8,249 attempted hand gestures. Code to reproduce the paper figures can be found here: https://github.com/cvargasi/GestureEncoding/
This dataset comprises intracortical neural signals recorded from two participants enrolled in the BrainGate2 pilot clinical trial (NCT00912041):
- T11: A 36-year-old man with tetraplegia due to a C4 AIS-B spinal cord injury that occurred 9 to 11 years prior to enrollment in the trial.
- T5: A 68-year-old man with tetraplegia due to a C4 AIS-C spinal cord injury that occurred 9 to 11 years prior to enrollment in the trial.
This research was conducted under an Investigational Device Exemption (IDE) granted by the US Food and Drug Administration (IDE #G090003; CAUTION: Investigational device. Limited by Federal law to investigational use).
All sessions took place at the participants’ residences.
Files and variables
File: Gesture_Encoding_processed_data.zip
Description: Variables & data organization
DataMatOL: Neural data collected during open-loop blocks. Neural data are stored in 3D matrices [K, T, F], where K = trial number, T = time bin, and F = neural feature. Neural features were either spiking activity (threshold crossings) or local field potential power in the 250 Hz - 5 kHz band (8th-order IIR Butterworth), collected in 20 ms bins and smoothed with a 200 ms Gaussian kernel. Each trial corresponds to the execution of one intended gesture and includes 50 bins (1 second) before and another 50 after the 'go' instruction, for a total of 100 bins (2 seconds). Only features that displayed significant task modulation were included in the analysis.
DataMatCL: Same format as DataMatOL, but for closed-loop control blocks.
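As a minimal illustration of the [K, T, F] trial structure and the smoothing step, here is a Python sketch on a synthetic matrix. Treating the 200 ms kernel width as the Gaussian sigma (expressed in 20 ms bins) is an assumption; the manuscript may parameterize the kernel differently.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

BIN_MS = 20      # bin width stated in the README
KERNEL_MS = 200  # Gaussian kernel width stated in the README

def smooth_features(data_mat):
    """Smooth a [K, T, F] neural data matrix along the time axis (T).

    Assumption: the 200 ms kernel is interpreted as sigma, i.e.
    200 / 20 = 10 bins. The actual analysis code may differ.
    """
    sigma_bins = KERNEL_MS / BIN_MS
    return gaussian_filter1d(data_mat, sigma=sigma_bins, axis=1)

# Synthetic stand-in: 10 trials, 100 bins (2 s around 'go'), 50 features
demo = np.random.poisson(2.0, size=(10, 100, 50)).astype(float)
smoothed = smooth_features(demo)
print(smoothed.shape)  # shape is preserved: (10, 100, 50)
```

The smoothing leaves the matrix shape unchanged, so trial and feature indexing works identically before and after.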
labels: structure with informational labels for each of the 8,249 trials in the concatenated data. Includes the following fields:
sessid: [8249×1 double] session number (1-8). Sessions 1-6 correspond to participant T11; sessions 7 and 8 to T5.
  - 1: T11 trial day 219
  - 2: T11 trial day 227
  - 3: T11 trial day 248
  - 4: T11 trial day 854
  - 5: T11 trial day 856
  - 6: T11 trial day 861
  - 7: T5 trial day 2084
  - 8: T5 trial day 2255
gest: [8249×1 string] the gesture cued for each trial
DGid: [8249×1 double] gesture group derived from the T11 dendrogram (1-7): 1 = Finger Flex, 2 = Thumb, 3 = Grasp, 4 = Abduction, 5 = Extension, 6 = Wrist, 7 = Do Nothing.
LRid: [8249×1 double] Left (1) or Right (2) hand
FlexExtid: [8249×1 double] Flexion (1) or extension (2)
gestNLR: [8249×1 string] Gesture labels that omit the Left / Right designations
Plus the following fields, which contain the names for specific groups (used to label figures):
DGnames: [ {'Finger Flex.'} {'Thumb'} {'Grasp'} {'Abduction'} {'Extension'} {'Wrist'} {'Do Nothing'} ]
LRnames: ['Left', 'Right']
FlexExt: ['Flexion', 'Extension']
UNI_C: After the latent spaces are generated and aligned, they are stored in the UNI_C variable, a matrix of size 8249 (number of trials) × 15 (number of latent space dimensions).
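A sketch of how the labels fields might be used to select trials once the .mat file is loaded (in Python, scipy.io.loadmat would typically be used). The labels structure below is a synthetic stand-in built from the field descriptions above, not the actual data.

```python
import numpy as np

# In practice the archive's .mat file would be read with, for example:
#   from scipy.io import loadmat
#   d = loadmat('Gesture_Encoding_processed_data.mat', squeeze_me=True)
# Field names below follow this README; the mock arrays stand in for the file.
rng = np.random.default_rng(0)
n_trials = 8249
labels = {
    'sessid': rng.integers(1, 9, n_trials),  # session number, 1-8
    'LRid':   rng.integers(1, 3, n_trials),  # 1 = left hand, 2 = right hand
}

# Per the README, sessions 7 and 8 belong to participant T5.
t5_mask = labels['sessid'] >= 7
left_t5 = t5_mask & (labels['LRid'] == 1)
print(int(t5_mask.sum()), int(left_t5.sum()))
```

The same boolean masks can index the trial axis of DataMatOL/DataMatCL or the rows of UNI_C, since all share the same trial ordering.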
Code/software
The code for reproducing the manuscript figures is made available at https://github.com/cvargasi/GestureEncoding/
The Gesture_Encoding_Analysis.m script runs other functions in the repository to perform the following:
- Latent space generation from neural data
- Latent space alignment
- Generation of gesture dendrograms based on the similarity of neural activity patterns
- Visualization of neural latent spaces (3D plots)
- Classification using the combined latent spaces
The script loads the processed data from 'Gesture_Encoding_processed_data.mat'.
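For illustration only, classification from an aligned latent space of UNI_C's shape (trials × 15 dimensions) can be sketched with a simple nearest-class-mean rule on synthetic data. This is not the classifier used in the manuscript, and the printed accuracy reflects the synthetic data, not the reported ~70%.

```python
import numpy as np

# Synthetic stand-in for a 15-D latent space with 48 gesture classes.
rng = np.random.default_rng(1)
n_classes, dim, per_class = 48, 15, 20
centers = rng.normal(0.0, 3.0, (n_classes, dim))
X = np.vstack([c + rng.normal(0.0, 1.0, (per_class, dim)) for c in centers])
y = np.repeat(np.arange(n_classes), per_class)

# Nearest-class-mean rule: assign each trial to the closest class centroid.
class_means = np.stack([X[y == k].mean(axis=0) for k in range(n_classes)])
dists = np.linalg.norm(X[:, None, :] - class_means[None, :, :], axis=2)
pred = dists.argmin(axis=1)
acc = (pred == y).mean()
print(f"nearest-class-mean accuracy on synthetic data: {acc:.2f}")
```

A real evaluation would use held-out trials (cross-validation) rather than the training-set accuracy computed here.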
Human subjects data
We confirm that the informed consent form signed by participants permits sharing of coded data that contains no other identifying information. The neural data contain no personally identifiable information (PII); filenames contain only a participant code, not PII.
