
Raw data from: Visual and motor signatures of locomotion dynamically shape a population code for feature detection in Drosophila, part 3

Data files

Nov 03, 2022 version, 157.64 GB total

Abstract

Natural vision is dynamic: as an animal moves, its visual input changes dramatically. How can the visual system reliably extract local features from an input dominated by self-generated signals? In Drosophila, diverse local visual features are represented by a group of projection neurons with distinct tuning properties. Here we describe a connectome-based volumetric imaging strategy to measure visually evoked neural activity across this population. We show that local visual features are jointly represented across the population, and that a shared gain factor improves trial-to-trial coding fidelity. A subset of these neurons, tuned to small objects, is modulated by two independent signals associated with self-movement: a motor-related signal and a visual motion signal. These two inputs adjust the sensitivity of these feature detectors across the locomotor cycle, selectively reducing their gain during saccades and restoring it during intersaccadic intervals. This work reveals a strategy for reliable feature detection during locomotion.

This dataset contains GCaMP6f and syt1GCaMP6f calcium responses from identified optic glomeruli in the Drosophila central brain, as well as walking behavior tracking data. It accompanies the manuscript "Visual and motor signatures of locomotion dynamically shape a population code for feature detection in Drosophila" by MH Turner et al.
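As a rough illustration of the shared-gain idea mentioned above, the Python sketch below simulates a population of tuned neurons whose trial-to-trial responses are scaled by a common multiplicative gain, then divides an estimate of that gain back out. This is a toy model under assumed parameters (lognormal gain, Gaussian noise, made-up variable names), not the authors' analysis of this dataset.

# Toy sketch (assumptions, not the published analysis): a shared per-trial
# gain scales all neurons; removing the estimated gain reduces variability.
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_trials = 20, 200

tuning = rng.uniform(0.5, 2.0, size=n_neurons)            # mean response per neuron
gain = rng.lognormal(mean=0.0, sigma=0.4, size=n_trials)  # shared per-trial gain

# Each trial: the shared gain multiplies every neuron's tuned response,
# plus independent noise per neuron.
responses = gain[:, None] * tuning[None, :] + rng.normal(0, 0.1, (n_trials, n_neurons))

# Estimate the shared gain from the population mean on each trial, then divide
# it out; trial-to-trial variability shrinks if the gain is truly shared.
gain_hat = responses.mean(axis=1) / tuning.mean()
corrected = responses / gain_hat[:, None]

cv = lambda r: r.std(axis=0).mean() / r.mean(axis=0).mean()
print("trial-to-trial CV before gain correction:", cv(responses))
print("trial-to-trial CV after gain correction: ", cv(corrected))

Running this prints a smaller coefficient of variation after correction, which is the intuition behind a shared gain factor improving trial-to-trial coding fidelity.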