Audiovisual task switching rapidly modulates sound encoding in mouse auditory cortex
Data files
Aug 31, 2022 version (924.97 MB)
- mouse117_4_rec2019-04-22_17-46-20.mat
- mouse121_3_rec2019-06-16_15-39-19.mat
- mouse121_5_rec2019-06-17_14-44-10.mat
- mouse121_5_rec2019-06-20_13-42-50.mat
- mouse121_5_rec2019-06-21_14-16-47.mat
- mouse126_4_rec2019-09-07_16-11-50.mat
- mouse126_4_rec2019-09-12_17-06-28.mat
- mouse128_2_rec2019-12-10_13-10-50.mat
- mouse131_1_rec2020-04-24_16-18-28.mat
- mouse131_1_rec2020-04-25_16-30-59.mat
- mouse133_1_rec2020-04-11_16-59-19.mat
- mouse133_1_rec2020-04-12_15-46-35.mat
- mouse133_1_rec2020-04-13_17-01-55.mat
- mouse133_1_rec2020-04-16_16-47-20.mat
- mouse133_3_rec2020-04-18_16-27-16.mat
- mouse133_3_rec2020-04-20_16-49-18.mat
- mouse133_3_rec2020-04-23_18-04-53.mat
- mouse134_3_rec2020-05-14_16-23-40.mat
- mouse134_3_rec2020-05-17_14-57-24.mat
- mouse134_3_rec2020-05-18_15-16-44.mat
- mouse134_3_rec2020-05-19_16-35-43.mat
- mouse137_2_rec2020-06-08_16-40-23.mat
- mouse137_2_rec2020-06-11_16-05-45.mat
- README.txt
Abstract
In everyday behavior, sensory systems are in constant competition for attentional resources, but the cellular and circuit-level mechanisms of modality-selective attention remain largely uninvestigated. We conducted translaminar recordings in mouse auditory cortex (AC) during an audiovisual (AV) attention-shifting task. Attending to sound elements in an AV stream reduced both pre-stimulus and stimulus-evoked spiking activity, primarily in deep-layer neurons. Despite the reduced spiking, stimulus decoder accuracy was preserved, suggesting improved sound-encoding efficiency. Similarly, task-irrelevant probe stimuli presented during intertrial intervals evoked fewer spikes without impaired stimulus encoding, indicating that these attentional influences generalized beyond the training stimuli. Importantly, these spiking reductions predicted trial-to-trial behavioral accuracy during auditory attention, but not visual attention. Together, these findings suggest that auditory attention facilitates sound discrimination by filtering sound-irrelevant spiking in AC, and that the deepest cortical layers may serve as a hub for integrating extramodal contextual information.
Methods
This dataset consists of extracellular physiology and behavioral measures recorded while mice performed an audiovisual rule-switching task (see the corresponding manuscript). The task is organized into blocks, each defined by a modality-specific rule. During the auditory rule (A-rule), mice used sound stimuli (tone clouds) to make decisions while ignoring simultaneously presented visual stimuli (drifting gratings). During the visual rule (V-rule), mice used visual stimuli while ignoring sounds.
The physiology data comprise single-unit spike times and waveforms, recorded with 64-channel linear silicon probes and an Intan acquisition system, and spike-sorted using KiloSort2 (Stringer et al. 2019). Behavioral lick responses during the task were recorded with a photobeam lickometer. Animal locomotion was recorded with an optical sensor measuring movement of a floating treadmill. Pupillometry, measured from videos of the left eye, was used to assess engagement during the task. Between task trials, a random double-sweep stimulus (Gourevitch et al. 2015) was presented to map spectrotemporal receptive fields.
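As an orientation to working with the spike-time data, below is a minimal Python sketch of binning spikes into a trial-aligned peristimulus time histogram (PSTH). The variable names spike_times and trial_onsets are hypothetical placeholders; the actual variable names are documented in README.txt.

    # Hypothetical sketch: bin single-unit spike times into a trial-aligned
    # PSTH. Variable names are placeholders; consult README.txt for the
    # actual field names in each session file.
    import numpy as np

    def trial_psth(spike_times, trial_onsets, window=(-0.5, 1.0), bin_size=0.01):
        """Mean firing rate per bin across trials, in spikes/s.

        spike_times: 1-D array of spike times (s); trial_onsets: trial start times (s).
        """
        edges = np.arange(window[0], window[1] + bin_size, bin_size)
        counts = np.zeros(len(edges) - 1)
        for onset in trial_onsets:
            aligned = spike_times - onset  # spike times relative to trial onset
            counts += np.histogram(aligned, bins=edges)[0]
        return counts / (len(trial_onsets) * bin_size)  # counts -> spikes/s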
Data are included only for behavior sessions that met the performance criterion (n = 23 sessions from 10 mice): a d' sensitivity index of at least 1.5 under both the auditory and visual rules, and a false-alarm rate below 0.5 on the stimuli with conflicting reward values across rules (AUVR in the A-rule and ARVU in the V-rule).
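The d' sensitivity index follows the standard signal-detection definition: the inverse-normal (z) transform of the hit rate minus that of the false-alarm rate. A minimal sketch of the inclusion check, using illustrative rates rather than values from this dataset:

    # Standard signal-detection d': z-transform (inverse normal CDF) of the
    # hit rate minus that of the false-alarm rate. All rates below are
    # illustrative examples, not values taken from this dataset.
    from scipy.stats import norm

    def dprime(hit_rate, fa_rate):
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    fa_conflict = 0.30  # example FA rate on the conflicting stimuli
    included = (dprime(0.90, 0.15) >= 1.5      # A-rule (example rates)
                and dprime(0.88, 0.20) >= 1.5  # V-rule (example rates)
                and fa_conflict < 0.5)         # conflicting-stimulus criterion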
Usage notes
Data are provided in MAT-file format and can be opened with MATLAB or Octave. A ReadMe file (README.txt) is included with full descriptions of all variables.
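For users working outside MATLAB or Octave, the files can also be read in Python. A minimal sketch, assuming the files are saved in MAT v7.2 or earlier (v7.3 files are HDF5-based and would require h5py instead):

    # Minimal sketch for inspecting one session file in Python. Assumes the
    # .mat files are v7.2 or earlier (scipy.io.loadmat cannot read v7.3).
    from scipy.io import loadmat

    data = loadmat("mouse117_4_rec2019-04-22_17-46-20.mat", squeeze_me=True)
    for name, value in data.items():
        if not name.startswith("__"):  # skip MAT-file header entries
            print(name, getattr(value, "shape", type(value)))

Passing squeeze_me=True collapses MATLAB's singleton dimensions, which usually makes the loaded arrays easier to index from Python.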