Fast cortical dynamics encode tactile grating orientation during active touch
Data files
Jun 03, 2021 version files (2.24 GB total)

- Fig1data.pkl (1.50 GB)
- Fig1data.txt (1.19 KB)
- Fig2data.pkl (605.33 MB)
- Fig2data.txt (1.39 KB)
- Fig3_6_data.pkl (138.60 MB)
- Fig3_6data.txt (1.76 KB)
Abstract
Touch-based object recognition relies on perception of compositional tactile features like roughness, shape, and surface orientation. However, besides roughness, it remains unclear how these different tactile features are encoded by neural activity that is linked with perception. Here, we establish a cortex-dependent perceptual task in which mice discriminate tactile gratings based on orientation using only their whiskers. Multi-electrode recordings in barrel cortex reveal weak orientation tuning in average firing rates during grating exploration despite high levels of cortical activity. Just before decision, orientation information extracted from fast cortical dynamics more closely resembles concurrent psychophysical measurements than single neuron orientation tuning curves. This temporal code conveys both stimulus and choice/action-related information, suggesting that fast cortical dynamics during exploration of a tactile object both reflect the physical stimulus and impact the decision.
Methods
Go-NoGo behavior, high-speed videos of whisker interactions with gratings, and multi-electrode extracellular electrophysiology.
Usage notes
The submission contains .txt files describing the structure of the Python dictionaries (stored in the corresponding .pkl files) that hold all of the relevant data.
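Since the data are distributed as pickled Python dictionaries, a minimal loading sketch may be useful. The loader below is an assumption, not part of the submission; the demo dictionary and its keys are stand-ins for illustration only. Consult the matching .txt file (e.g. Fig1data.txt) for the actual keys and their meanings.

```python
import pickle

def load_figure_data(path):
    """Load one of the figure dictionaries (e.g. 'Fig1data.pkl')."""
    with open(path, "rb") as f:
        return pickle.load(f)

# Round-trip demonstration with a hypothetical stand-in dictionary;
# the real files use keys documented in the accompanying .txt files.
demo = {"trial_ids": [1, 2, 3], "orientation_deg": [0, 90, 0]}
with open("demo.pkl", "wb") as f:
    pickle.dump(demo, f)

loaded = load_figure_data("demo.pkl")
print(sorted(loaded.keys()))  # inspect available keys first
```

Note that loading a pickle executes arbitrary code from the file, so only unpickle files obtained from a trusted source such as this repository.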