Data from: Area 2 of primary somatosensory cortex encodes kinematics of the whole arm
Data files
Jan 13, 2020 version (4.02 GB)
Abstract
Proprioception, the sense of body position, movement, and associated forces, remains poorly understood, despite its critical role in movement. Most studies of area 2, a proprioceptive area of somatosensory cortex, have simply compared neurons' activities to the movement of the hand through space. Using motion tracking, we sought to elaborate this relationship by characterizing how area 2 activity relates to whole-arm movements. We found that a whole-arm model, unlike classic models, successfully predicted how features of neural activity changed as monkeys reached to targets in two workspaces. However, when we then evaluated this whole-arm model across active and passive movements, we found that many neurons did not consistently represent the whole arm over both conditions. These results suggest that 1) neural activity in area 2 represents the whole arm during reaching and 2) many of these neurons represent limb state differently during active and passive movements.
Methods
For full methodological information, see:
[1] Chowdhury, R.H., Glaser, J.I., and Miller, L.E. (2019). Area 2 of primary somatosensory cortex encodes kinematics of the whole arm. eLife. doi: 10.7554/eLife.48198
Usage notes
This data set includes behavioral recordings and extracellular neural recordings from area 2 of primary somatosensory cortex of rhesus macaques during two separate reaching experiments. Raeed Chowdhury collected and processed the data in the laboratory of Lee Miller for use in Chowdhury et al. 2019 (accepted at eLife as of December 2019), which characterized how area 2 neurons represent reaching movements. The results and methodology from these experiments are described in [1].
In both experiments, monkeys controlled a cursor on a screen using a two-link, planar manipulandum. In the first experiment, from which we include eight sessions, monkeys reached to sequential, visually presented targets in one of two workspaces: one near the body on the side contralateral to the reaching arm, and one far from the body on the ipsilateral side. In the second experiment, from which we include four sessions, monkeys performed a simple center-out task in which, on some randomly selected trials, the manipulandum applied a perturbation to the monkey's hand during the center-hold period. During these reaching tasks, we tracked the locations of ten markers on the monkey's arm, which we used to estimate joint angles and muscle lengths. In addition to the behavioral data, we collected neural data from area 2 using Blackrock Utah multielectrode arrays, yielding ~100 channels of extracellular recordings per monkey. Recordings from these channels were thresholded online to detect spikes, which were sorted offline into putative single units.
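As a rough illustration of how one might start inspecting a session of this kind, the Python sketch below loads a single data file and bins each unit's spikes at the kinematic sampling rate. The file name, variable names, and field layout are hypothetical placeholders, not the actual structure of the deposited files; consult the included readme and the analysis repository linked below for the real organization.

```python
# Minimal, hypothetical sketch for inspecting one session of this dataset.
# All file and field names below are placeholders; see the dataset readme
# for the actual layout of the deposited files.
import numpy as np
from scipy.io import loadmat

session = loadmat("example_session.mat", squeeze_me=True)  # hypothetical file name

spike_times = session["spike_times"]  # hypothetical: one spike-time array per sorted unit
hand_pos = session["hand_pos"]        # hypothetical: (T, 2) planar hand position from the manipulandum
marker_pos = session["marker_pos"]    # hypothetical: (T, 10, 3) tracked arm-marker positions
t = session["t"]                      # hypothetical: (T,) time stamps in seconds

# Bin each unit's spikes on the same time base as the kinematics.
bin_edges = np.append(t, t[-1] + np.median(np.diff(t)))
binned = np.stack([np.histogram(st, bins=bin_edges)[0] for st in spike_times], axis=1)

print(f"{binned.shape[1]} units, {binned.shape[0]} time bins")
```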
In addition to the data from these experiments, we also include data from several sensory mapping sessions with the three monkeys, in which we characterized the sensory receptive fields associated with a number of electrodes on the arrays.
The analysis code used to produce the figures for [1] provides useful examples of how to work with this dataset. See https://github.com/raeedcho/s1-kinematics.git for the code and readme.
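To give a feel for the kind of encoding-model comparison described above (hand-only versus whole-arm predictors of neural activity), the sketch below fits two linear models to a binned firing rate and compares cross-validated fits on synthetic stand-in data. It is illustrative only, not the authors' analysis pipeline (which is the MATLAB code in the repository above), and all variable names and feature choices are assumptions.

```python
# Illustrative comparison of a hand-kinematics model and a whole-arm model
# for predicting one neuron's binned firing rate. Not the authors' pipeline;
# the data here are synthetic placeholders.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
T = 5000
hand_features = rng.standard_normal((T, 4))            # e.g., hand position and velocity
arm_features = np.hstack([hand_features,
                          rng.standard_normal((T, 14))])  # e.g., joint angles and angular velocities
rate = rng.poisson(2.0, size=T).astype(float)           # stand-in for one unit's binned spike counts

hand_score = cross_val_score(Ridge(alpha=1.0), hand_features, rate, cv=5, scoring="r2").mean()
arm_score = cross_val_score(Ridge(alpha=1.0), arm_features, rate, cv=5, scoring="r2").mean()
print(f"hand-only R^2: {hand_score:.3f}, whole-arm R^2: {arm_score:.3f}")
```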
If you publish any work using these data, please cite the publication above ([1] Chowdhury et al., 2019) and also cite this data set.