Data for: Probing visual sensitivity and attention in mice using reverse correlation
Data files
Aug 25, 2023 version (247.61 MB)
Abstract
Visual attention allows the brain to evoke behaviors based on the most important visual features. Mouse models offer immense potential to gain a circuit-level understanding of this phenomenon, yet how mice distribute attention across features and locations is not well understood. Here, we describe a new approach to address this limitation by training mice to detect weak vertical bars in a background of checkerboard noise while spatial cues manipulated their attention. By adapting a reverse correlation method from human studies, we linked behavioral decisions to stimulus features and locations. We show that mice voluntarily deploy attention to a small rostral region of the visual field. Within this region, mice attended to multiple features (orientation, spatial frequency, contrast) that indicated the presence of weak vertical bars. This attentional tuning grew with training, multiplicatively scaled behavioral sensitivity, approached that of an ideal observer, and resembled the effects of attention in humans. Taken together, we demonstrate that mice can simultaneously attend to multiple features and locations of a visual stimulus.
Methods
Sample dataset for running the behavioral reverse correlation analysis described in Lehnert et al., 2023.
Analysis code can be found at: https://github.com/Swamylab/Lehnert_et_al_Attention
The README file contains a description of the code, variables, and other details.
Contact erik.cook@mcgill.ca for questions about the data format and analysis code.
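For orientation only, the snippet below sketches the general idea behind behavioral reverse correlation: averaging the noise shown on trials grouped by the animal's report and taking the difference yields a psychophysical kernel. This is a minimal illustration, not the authors' pipeline; the function name, array shapes, and toy data are assumptions, and the actual analysis and data format are documented in the GitHub repository and README linked above.

```python
import numpy as np

def reverse_correlation_kernel(noise_frames, responses):
    """Estimate a psychophysical kernel by reverse correlation.

    noise_frames : array, shape (n_trials, height, width)
        Checkerboard noise shown on each trial (hypothetical layout).
    responses : boolean array, shape (n_trials,)
        True where the subject reported "signal present".

    Returns the mean noise on "present" reports minus the mean noise on
    "absent" reports; stimulus features that drive the decision appear
    as structure in this difference map.
    """
    noise_frames = np.asarray(noise_frames, dtype=float)
    responses = np.asarray(responses, dtype=bool)
    return noise_frames[responses].mean(axis=0) - noise_frames[~responses].mean(axis=0)

# Toy usage with random data, only to show the call signature.
rng = np.random.default_rng(0)
frames = rng.standard_normal((500, 32, 32))
choices = rng.random(500) < 0.5
kernel = reverse_correlation_kernel(frames, choices)
print(kernel.shape)  # (32, 32)
```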