Data from: Multi-gesture drag-and-drop decoding in a 2D iBCI control task
Data files

Apr 08, 2025 version files (8.23 GB total):

- README.md (7.24 KB)
- sesData_DD_T11.mat (4.15 GB)
- sesData_GH_T11.mat (3.37 GB)
- sesData_GH_T5.mat (712.08 MB)
Abstract
Objective. Intracortical brain-computer interfaces (iBCIs) have demonstrated the ability to enable point-and-click as well as reach-and-grasp control for people with tetraplegia. However, few studies have investigated iBCIs during long-duration discrete movements that would enable common computer interactions such as “click-and-hold” or “drag-and-drop.”
Approach. Here, we examined the performance of multi-class and binary (attempt/no-attempt) classification of neural activity in the left precentral gyrus of two BrainGate2 clinical trial participants performing hand gestures for 1, 2, and 4 seconds in duration. We then designed a novel “latch decoder” that utilizes parallel multi-class and binary decoding processes and evaluated its performance on data from isolated sustained gesture attempts and a multi-gesture drag-and-drop task.
Main Results. Neural activity during sustained gestures revealed a marked decrease in the discriminability of hand gestures sustained beyond 1 second. Compared to standard direct decoding methods, the latch decoder demonstrated substantial improvement in decoding accuracy for gestures performed independently or in conjunction with simultaneous 2D cursor control.
Significance. This work highlights the unique neurophysiological response patterns of sustained gesture attempts in human motor cortex and demonstrates a promising decoding approach that could enable individuals with tetraplegia to intuitively control a wider range of consumer electronics using an iBCI.
These data are released with the manuscript: Gusman, Hosman, et al., 2025, “Multi-gesture drag-and-drop decoding in a 2D iBCI control task.”
Dataset DOI: 10.5061/dryad.98sf7m0v1
Dataset Description
This dataset comprises intracortical neural signals recorded from two participants enrolled in the BrainGate2 pilot clinical trial (NCT00912041):
- T11: a 39-year-old man with tetraplegia due to a C4 AIS-B spinal cord injury (SCI) that occurred 9 to 11 years prior to enrollment in the trial.
- T5: a 70-year-old man with tetraplegia due to a C4 AIS-C SCI that occurred 9 to 11 years prior to enrollment in the trial.
This research was conducted under an Investigational Device Exemption (IDE) granted by the US Food and Drug Administration (IDE #G090003; CAUTION: Investigational device. Limited by Federal law to investigational use).
All sessions took place at the participants’ residences.
BCI behavioral tasks
- Gesture Hero Task: An open-loop task wherein participants were asked to attempt and maintain one hand gesture from a set (7 gestures for T11, 3 gestures for T5) for 1, 2, or 4 seconds. T11 performed the Gesture Hero task on two session days, each session consisting of 10 data blocks and 20 trials per gesture-duration condition. T5 performed the Gesture Hero task on one session day over 3 data blocks, acquiring a total of 20 trials per gesture-duration condition. See manuscript for additional task details.
- Multi-Gesture Drag-and-Drop Task: The Drag-and-Drop task consisted of a 2D center-out-and-return task with four outer targets positioned cardinally from the center target. Each data collection block contained three trial variations: “Move Only”, “Click”, and “Drag” trials. Each trial consisted of multiple stages: “Prepare”, “Center Out”, “Wait”, “Return”, and, for Drag trials only, a “Hold” stage. Please see the manuscript for an in-depth description of this task. Participant T11 performed the Drag-and-Drop task during two sessions, each with a total of 11 blocks. Each block contained 4 Move Only trials, 12 Click trials, and 12 Drag trials. The first four blocks of each session were used to calibrate the decoders (a steady-state Kalman filter for 2D kinematic decoding and the Latch decoder for gesture decoding), and the seven subsequent blocks were treated as assessment blocks.
Intracortical neural recordings and neural features
Each participant had two 96-channel microelectrode arrays (Blackrock Neurotech, Salt Lake City, UT) placed in the dominant (left) hand knob area of the precentral gyrus. Using custom Simulink (MathWorks) software, we performed online extraction of seven neural features (non-causal threshold crossings (TX), spike band power (SP), and five LFP bands) in 20 ms time steps. This dataset includes the concatenated and z-scored (3 min rolling window) neural features for each of the 192 recording channels, for a total of 1344 features (192 channels × 7 feature types).
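For orientation, here is a minimal sketch of the form of that normalization; the released neural features are already z-scored, so this is illustrative only. It assumes a causal (trailing) 3-minute window (see the manuscript for the exact windowing used), and `rawFeat` is a hypothetical un-normalized [nStep x 1344] matrix:

```matlab
% Illustrative rolling z-score: 3 min window at 20 ms bins = 9000 steps.
% rawFeat is hypothetical; the released features are already z-scored.
win   = 3 * 60 / 0.020;                      % 9000 bins per 3 min window
mu    = movmean(rawFeat, [win - 1, 0], 1);   % trailing mean, per column
sd    = movstd(rawFeat, [win - 1, 0], 0, 1); % trailing std, per column
zFeat = (rawFeat - mu) ./ max(sd, eps);      % guard against zero variance
```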
Files and Variables
This dataset contains three .mat files:
- sesData_GH_T11.mat: a 1x2 struct array with data from T11’s two Gesture Hero sessions
- sesData_GH_T5.mat: a 1x1 struct with data from T5’s one Gesture Hero session
- sesData_DD_T11.mat: a 1x2 struct array with data from T11’s two Drag-and-Drop sessions
Each struct contains the following fields:
participant
: [str] ‘t11’ or ‘t5’
sessionNum
: [1 x 1] Session number for this task/participant (1 or 2)
blocks
: [1 x nBlocks] Vector of block numbers that task data represents
feat
: [nStep x 1344] All feature data for this session (20 ms timesteps, all blocks concatenated)
featInfo
: [struct] Struct indicating columns of feat corresponding to each feat type
taskInfo
: [struct] Struct containing task information (see below)
outCL
: [struct] Drag-and-Drop tasks only; Additional outputs collected during CL control (see below)
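A minimal MATLAB loading sketch, assuming the struct array inside each file is stored under the name `sesData` (verify with `whos`) and that `featInfo` carries one field per feature type; the `TX` field below is a guess, so list the real field names first:

```matlab
whos('-file', 'sesData_GH_T11.mat')  % confirm the stored variable name
S   = load('sesData_GH_T11.mat');
ses = S.sesData(1);                  % assumed name; first of T11's two sessions

% 192 channels x 7 feature types = 1344 feature columns, in 20 ms bins.
[nStep, nFeat] = size(ses.feat);     % nFeat should be 1344

fieldnames(ses.featInfo)             % inspect the actual column mapping
txCols = ses.featInfo.TX;            % hypothetical field: threshold crossings
tx     = ses.feat(:, txCols);        % [nStep x 192] TX features
```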
taskInfo (Gesture Hero sessions)
startStops
: [nTrial x 2] Time step indices of Go (start) and Stop cues for each trial
noActionStartStops
: [nTrial x 2] Time step indices denoting beginning and end of non-trial period after each trial
labels
: [nTrial x 1] Categorical; gesture cued for each trial
blockNumber
: [nTrial x 1] Block number of each trial
trialDuration
: [nTrial x 1] Duration of each trial in time steps (i.e., 50, 100, or 200, corresponding to 1, 2, or 4 s)
prctNS5Outlier
: [nTrial x 1] Percent of each trial epoch containing NS5 (voltage data) outliers
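As an example of how these fields combine, the sketch below (continuing from the `ses` struct loaded above) collects the 1-second Gesture Hero trials into a trials × steps × features array:

```matlab
% Gather 1 s trials (50 steps x 20 ms) into [nTrials1s x 50 x 1344].
ti  = ses.taskInfo;
idx = find(ti.trialDuration == 50);        % 1 s trials only
X   = nan(numel(idx), 50, size(ses.feat, 2));
for k = 1:numel(idx)
    go = ti.startStops(idx(k), 1);         % Go cue time step
    X(k, :, :) = ses.feat(go : go + 49, :);
end
y = ti.labels(idx);                        % cued gesture per trial
```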
taskInfo (Drag-and-Drop sessions)
startStops
: [nStage x 2] Time steps of beginning and end of each trial stage
onTargetStartStops
: {nStage x 1} Cell of arrays indicating time steps cursor was on target
cuedGesture
: [nStage x 1] Categorical; gesture participant was attempting during each trial stage
cuedDirection
: [nStage x 1] Categorical; movement direction participant was attempting during each stage
isDragAttempt
: [nStage x 1] Logical; is participant dragging (holding gesture) during stage
isClickAttempt
: [nStage x 1] Logical; is participant attempting a ‘click’ during this stage
isWait
: [nStage x 1] Logical; is this a ‘Wait’ stage
trialNum
: [nStage x 1] Trial number of each trial stage
trialStage
: {nStage x 1} Trial stage (string)
blockNumber
: [nStage x 1] Block number of each trial stage
trialAttemptType
: [nStage x 1] Categorical; trial type (‘Move’, ‘Click’, ‘Drag’, or ‘Intertrial’)
prctOutliersPerEpoch
: [nStage x 1] Percent of each trial stage containing NS5 (voltage data) outliers
kinErrorAttenuation
: [nStage x 1] Error attenuation (EA) used in this stage
isKinDecoderCL
: [nStage x 1] Logical; are kinematics under closed-loop control (incl. EA blocks)
isGestureDecoderCL
: [nStage x 1] Logical; are gestures under closed-loop control
imagery
: {1 x 4} Names of 4 possible gesture states (incl. ‘no_action’)
outCL (Drag-and-Drop sessions only)
decodedState
: [nStep x 1] Categorical; gesture state decoded at each time step
stateLL
: [nStep x 4] Gesture state likelihoods (normalized) at each time step
stateThresh
: [1 x 1] Likelihood threshold (0.9998) needed to decode gesture
isOnTarget
: [nStep x 1] Logical; whether the cursor is on target at each time step (0 or 1)
wherein:
nStep = total number of 20 ms steps (bins) in session
nTrial = total number of trials in a given session
nStage = total number of trial stages in a Drag-and-Drop session
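Putting the Drag-and-Drop fields together, the sketch below (under the same `sesData` naming assumption as above) scores step-wise agreement between the closed-loop decoded gesture and the cued gesture during the “Hold” stages of Drag trials. It is restricted to the seven assessment blocks, assuming `blocks` lists block numbers in chronological order:

```matlab
S   = load('sesData_DD_T11.mat');
ses = S.sesData(1);                        % first of T11's two sessions
ti  = ses.taskInfo;

% Hold stages of Drag trials in the assessment blocks (the first 4
% blocks of each session were used for decoder calibration).
isHold   = strcmp(ti.trialStage, 'Hold');
isAssess = ismember(ti.blockNumber, ses.blocks(5:end));
stages   = find(isHold & isAssess);

correct = 0;  total = 0;
for s = stages'                            % iterate over stage indices
    steps   = ti.startStops(s, 1) : ti.startStops(s, 2);
    correct = correct + sum(ses.outCL.decodedState(steps) == ti.cuedGesture(s));
    total   = total + numel(steps);
end
fprintf('Step-wise hold decoding agreement: %.1f%%\n', 100 * correct / total);
```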
Code/software
The code for reproducing the manuscript figures is made available at https://github.com/jgusman/drag-and-drop.
Human subjects data
We confirm that the informed consent form signed by participants permits sharing of coded data that does not contain other identifying information. Neural data contain no personally identifiable information (PII). Filenames contain only a participant code and no PII.