Audiovisual task switching rapidly modulates sound encoding in mouse auditory cortex
Data files
Aug 31, 2022 version (924.97 MB)
Abstract
In everyday behavior, sensory systems are in constant competition for attentional resources, but the cellular and circuit-level mechanisms of modality-selective attention remain largely uninvestigated. We conducted translaminar recordings in mouse auditory cortex (AC) during an audiovisual (AV) attention-shifting task. Attending to sound elements in an AV stream reduced both pre-stimulus and stimulus-evoked spiking activity, primarily in deep-layer neurons. Despite the reduced spiking, stimulus decoder accuracy was preserved, suggesting improved sound-encoding efficiency. Similarly, task-irrelevant probe stimuli presented during intertrial intervals evoked fewer spikes without impairing stimulus encoding, indicating that these attentional influences generalized beyond the training stimuli. Importantly, these spiking reductions predicted trial-to-trial behavioral accuracy during auditory attention, but not during visual attention. Together, these findings suggest that auditory attention facilitates sound discrimination by filtering sound-irrelevant spiking in AC, and that the deepest cortical layers may serve as a hub for integrating extramodal contextual information.
Methods
This dataset consists of extracellular physiology and behavioral measures recorded while mice performed an audiovisual rule-switching task (see the corresponding manuscript). The task is organized into blocks, each defined by a modality-specific rule. During the auditory rule (A-rule), mice used sound stimuli (tone clouds) to make decisions while ignoring simultaneously presented visual stimuli (drifting gratings). During the visual rule (V-rule), mice used the visual stimuli while ignoring the sounds.
The physiology data comprise single-unit spike times and waveforms, recorded using 64-channel linear silicon probes with an Intan acquisition system and spike-sorted using Kilosort2 (Stringer et al. 2019). Lick responses during the task were recorded using a photobeam lickometer. Animal locomotion was recorded with an optical sensor measuring the movement of a floating treadmill. Pupillometry, measured from videos of the left eye recorded during the task, was used to assess engagement during the task. Between task trials, a random double sweep stimulus (Gourevitch et al. 2015) was presented to map spectrotemporal receptive fields.
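As one illustration of how the receptive-field mapping data could be used, below is a minimal spike-triggered-average (STA) sketch in MATLAB/Octave. It is illustrative only: the variable names (stimSpec, spikeIdx) are hypothetical placeholders rather than names from this dataset (see README.txt for the actual variables), and the STA is only one of several possible STRF estimators.

% Minimal spike-triggered average (STA) sketch for STRF estimation.
% Hypothetical inputs (names are illustrative, not from this dataset):
%   stimSpec - [nFreq x nTime] stimulus spectrogram, one column per frame
%   spikeIdx - spike times expressed as integer column indices into stimSpec
nLags = 20;                                 % frames of stimulus history per spike
valid = spikeIdx(spikeIdx > nLags);         % keep spikes with a full history window
sta   = zeros(size(stimSpec, 1), nLags);    % [nFreq x nLags] accumulator
for k = 1:numel(valid)
    sta = sta + stimSpec(:, valid(k)-nLags+1:valid(k));
end
sta = sta / numel(valid);                   % average stimulus preceding a spike
sta = sta - mean(stimSpec, 2);              % subtract the mean stimulus as baseline
imagesc(sta); axis xy;                      % rightmost column = frame at spike time
xlabel('Time before spike (frames)'); ylabel('Frequency bin');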
Data are included only for behavioral sessions that met the performance criterion (n = 23 sessions from 10 mice): a d' sensitivity index of at least 1.5 for both the auditory and visual rules, and a false-alarm rate below 0.5 for the stimuli with conflicting reward values across rules (AUVR in the A-rule and ARVU in the V-rule).
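For reference, the d' sensitivity index follows the standard signal-detection definition, d' = z(hit rate) - z(false-alarm rate), where z() is the inverse of the standard normal CDF. A minimal MATLAB/Octave sketch (the rates shown are illustrative examples, not values from this dataset):

% d' = z(hit rate) - z(false-alarm rate), written with base-MATLAB erfinv
% so no toolbox is required. hitRate and faRate are illustrative inputs in (0, 1).
z       = @(p) sqrt(2) * erfinv(2*p - 1);   % inverse standard normal CDF
hitRate = 0.90;                             % example hit rate
faRate  = 0.20;                             % example false-alarm rate
dprime  = z(hitRate) - z(faRate);           % = 1.28 - (-0.84) = 2.12 here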
Usage notes
Data are provided in MAT-File format, which can be opened with MATLAB or Octave. A ReadMe file (README.txt) is included with full descriptions of all variables.
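As a quick-start sketch, the files can be loaded as follows; the file name below is a hypothetical placeholder, and README.txt documents the actual files and variable names:

% Minimal loading sketch for MATLAB or Octave. The file name is a
% placeholder; see README.txt for the actual file and variable names.
S = load('example_session.mat');   % load all variables into a struct
disp(fieldnames(S));               % list the variables stored in the file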