Visual and motor signatures of locomotion dynamically shape a population code for feature detection in Drosophila

Curation statements for this article:
  • Curated by eLife

    eLife assessment

    This manuscript investigates how the fly visual system can encode specific features in the presence of self-generated motion. Using volumetric imaging, it explores the encoding of visual features in population activity in the Drosophila visual glomeruli - a set of visual "feature detectors". Through an elegant combination of neural imaging, visual stimulus manipulations, and behavioral analysis, it demonstrates that two different mechanisms, one based on motor signals and one based on visual input, serve to suppress local features during movements that would corrupt these features. The results of this study open up future directions to determine how motor and visual signals are integrated into visual processing at the level of neural circuits.

This article has been reviewed by the following groups


Abstract

Natural vision is dynamic: as an animal moves, its visual input changes dramatically. How can the visual system reliably extract local features from an input dominated by self-generated signals? In Drosophila, diverse local visual features are represented by a group of projection neurons with distinct tuning properties. Here, we describe a connectome-based volumetric imaging strategy to measure visually evoked neural activity across this population. We show that local visual features are jointly represented across the population, and a shared gain factor improves trial-to-trial coding fidelity. A subset of these neurons, tuned to small objects, is modulated by two independent signals associated with self-movement: a motor-related signal and a visual motion signal associated with rotation of the animal. These two inputs adjust the sensitivity of these feature detectors across the locomotor cycle, selectively reducing their gain during saccades and restoring it during intersaccadic intervals. This work reveals a strategy for reliable feature detection during locomotion.

Article activity feed

  1. eLife assessment

    This manuscript investigates how the fly visual system can encode specific features in the presence of self-generated motion. Using volumetric imaging, it explores the encoding of visual features in population activity in the Drosophila visual glomeruli - a set of visual "feature detectors". Through an elegant combination of neural imaging, visual stimulus manipulations, and behavioral analysis, it demonstrates that two different mechanisms, one based on motor signals and one based on visual input, serve to suppress local features during movements that would corrupt these features. The results of this study open up future directions to determine how motor and visual signals are integrated into visual processing at the level of neural circuits.

  2. Reviewer #1 (Public Review):

    Detecting a small object is challenging, particularly when the animal is moving, because self-generated visual motion interferes with visual perception. Turner et al. established a new method to record neural activity simultaneously from multiple populations of local feature-detecting visual neurons (lobula columnar projection neurons, or LCs) by improving conventional calcium imaging with a fast, presynapse-restricted calcium indicator and careful image alignment. They found that LCs can be categorized into four types depending on their visual feature selectivity. By recording simultaneously from multiple LC types, the authors found, for the first time, that several LC types covary in their activity, which improves visual feature encoding. The authors then performed calcium imaging in walking flies and found that visual responses are generally suppressed during walking, particularly in small-object-detecting populations. A portion of the shared activity among LC populations was explained by this walking-related modulation. Similarly, global visual motion, such as that expected during a fly's naturalistic walking, suppressed responses to local visual features in a motion-coherence-dependent manner. The suppressive effect was prominent when the visual motion was fast and contained low spatial frequency components. Finally, visual and walking-related signals independently suppressed neural responses during saccadic events. This extensive body of evidence fits nicely with the idea that the fly engages in visual feature processing only during straight walking, while visual inputs are effectively shut down during sharp turns, when contamination by self-generated visual motion is non-negligible. In contrast, responses to important visual stimuli, such as looming produced by predators, are maintained under all conditions.
    The authors provide a comprehensive view of how a visual circuit operates under natural conditions and further strengthen the growing idea, shared across species, that sensory perception is dynamically structured during movement.

  3. Reviewer #2 (Public Review):

    This study examines the encoding of distinct visual features during self-motion and reveals distinct mechanisms that contribute to the suppression of features that may be corrupted during self-motion - one based on motor output and one based on the resulting visual input. The authors develop an imaging approach to measure neural activity across many glomeruli, which enables analysis in terms of population codes. They first demonstrate that even though movement strongly alters the responses of individual glomeruli, a population-based readout is still able to decode stimulus identity. They then demonstrate that this modulation is primarily suppression of glomeruli that respond to local features, while responses to global features (e.g., looming) are unaltered. Finally, through a combination of visual stimulus manipulations that mimic the effect of movement and analysis of responses relative to behavioral epochs, they show that both the visual input and a motor signal contribute to this suppression.

    Together, this provides an elegant explanation of how different signals combine to adapt sensory processing to ongoing behavior. The experiments are cleverly designed and the results are clearly presented, with few technical concerns. The only significant concern is how well their imaging isolated the visual projection neurons they were targeting.

    This study is likely to have a significant impact as it provides a new view on a timely question in visual neuroscience. The study also opens up clear future directions to determine how these two signals are generated and integrated into visual processing, at the neural circuit level. Finally, it provides intriguing parallels to the impact of eye movements on the mammalian visual system.

  4. Reviewer #3 (Public Review):

    This manuscript presents a nice approach for performing population recordings from the optic glomeruli of Drosophila, allowing exploration of how visual stimuli are encoded at a population level. The authors use a combination of behavioral recordings and visual perturbations to identify two mechanisms that contribute to the suppression of visual responses during body saccades: one motor-related and one visual. Overall, this study presents a nice combination of imaging and analysis to determine the mechanisms by which the visual system tunes out signals associated with self-movement to produce a reliable encoding of the visual world. I do have some concerns about the sources of the gain modulation that they describe across the population, and I was confused by some aspects of the framing in terms of self-motion and visual feature decoding.