A compact multisensory representation of self-motion is sufficient for computing an external world variable


Abstract

External forces shape navigation but cannot be directly measured by an animal in motion. How the brain integrates multimodal cues to estimate external forces remains unclear. Here we investigate the representation of multimodal self-motion cues across the columnar inputs to the fly navigation center known as PFNs. We find that one PFN type integrates optic flow and airflow direction signals with distinct dynamics, while a different type encodes airspeed. Based on these data, we construct and validate models of how multisensory dynamics are encoded across PFNs, allowing us to simulate neural responses during rapid flight maneuvers. Applying a nonlinear observability analysis to these responses, we show that PFN representations during active maneuvers are sufficient to decode the direction of an external force (wind) during free flight. Our work provides evidence that active sensation, combined with multisensory encoding, can allow a compact nervous system to infer a property of the external world that cannot be directly measured by any single sensory system.
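The abstract does not specify the exact form of the observability analysis, but a common empirical variant asks whether small perturbations of a hidden state (here, wind direction) produce distinguishable changes in the sensor outputs along an actively maneuvering trajectory. The sketch below illustrates that idea with a toy self-motion model; the dynamics, output map, and state names are illustrative assumptions, not the authors' model.

```python
import numpy as np

# Minimal sketch of an empirical nonlinear observability analysis.
# State x = [heading, airspeed, wind_direction]; wind direction is the
# hidden external-world variable. Outputs mimic two cue channels:
# airflow-like signals and heading-dependent (optic-flow-like) signals.
# All of this is a hedged toy model, not the paper's simulation.

def simulate_outputs(x0, T=200, dt=0.01):
    """Integrate toy dynamics from x0 and return the stacked output vector."""
    heading, airspeed, wind_dir = x0
    ys = []
    for k in range(T):
        t = k * dt
        heading = heading + 2.0 * np.sin(3.0 * t) * dt  # active maneuvers
        airflow_angle = wind_dir - heading  # apparent airflow in body frame
        ys.append([np.cos(airflow_angle) * airspeed,
                   np.sin(airflow_angle) * airspeed,
                   np.cos(heading), np.sin(heading)])
    return np.asarray(ys).ravel()

def empirical_observability_gramian(x0, eps=1e-4):
    """Central-difference observability Gramian around the trajectory from x0."""
    n = len(x0)
    cols = []
    for i in range(n):
        dx = np.zeros(n)
        dx[i] = eps
        y_plus = simulate_outputs(x0 + dx)
        y_minus = simulate_outputs(x0 - dx)
        cols.append((y_plus - y_minus) / (2 * eps))
    J = np.stack(cols, axis=1)  # output sensitivity to each state component
    return J.T @ J              # full rank => locally observable state

x0 = np.array([0.3, 1.0, 1.2])  # heading (rad), airspeed (m/s), wind dir (rad)
W = empirical_observability_gramian(x0)
sv = np.linalg.svd(W, compute_uv=False)
print("Gramian singular values:", sv)
print("wind direction observable:", sv[-1] > 1e-8)
```

A nonzero smallest singular value of the Gramian indicates that every state component, including the unmeasured wind direction, can in principle be reconstructed from the outputs; in this toy setting, the active heading maneuvers are what make the wind term separable from heading and airspeed.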
