Visual and motor signatures of locomotion dynamically shape a population code for feature detection in Drosophila.


Abstract

Natural vision is dynamic: as an animal moves, its visual input changes dramatically. How can the visual system reliably extract local features from an input dominated by self-generated signals? In Drosophila, diverse local visual features are represented by a group of projection neurons with distinct tuning properties. Here we describe a connectome-based volumetric imaging strategy to measure visually evoked neural activity across this population. We show that local visual features are jointly represented across the population, and that a shared gain factor improves trial-to-trial coding fidelity. A subset of these neurons, tuned to small objects, is modulated by two independent signals associated with self-movement, a motor-related signal and a visual motion signal associated with rotation of the animal. These two inputs adjust the sensitivity of these feature detectors across the locomotor cycle, selectively reducing their gain during saccades and restoring it during intersaccadic intervals. This work reveals a strategy for reliable feature detection during locomotion.

Citation (APA)

Turner, M. H., Krieger, A., Pang, M. M., & Clandinin, T. R. (2022). Visual and motor signatures of locomotion dynamically shape a population code for feature detection in Drosophila. eLife, 11. https://doi.org/10.7554/eLife.82587
