A dynamic causal inference framework for perception–action loops
Abstract
Causal inference, the process of inferring the causes of our sensory input, is central to multisensory perception. While most computational models of causal inference focus on static perceptual tasks with no temporal or motor components, real-world behavior unfolds dynamically and often involves closed-loop control. Here, we introduce and validate a general modeling framework that unifies multisensory causal inference with optimal feedback control by casting the problem as inference and action in a switching linear dynamical system (SLDS). Our framework combines approximate inference over latent causal structures with a mixture-of-controllers approach, in which each controller is optimized for a specific causal model and weighted by the current belief in that model. We show that classical static models of multisensory perception are special cases of our framework, and extend them to dynamic, action-oriented settings where inference and control evolve over time. Using simulations of previously published behavioral tasks, including visuo-vestibular heading discrimination and path integration with interception, we show that the model reproduces key empirical findings, such as S-shaped biases in heading estimation and motor strategies based on inferred object motion. We also demonstrate the versatility of our approach in novel task extensions, including motion in depth, latent switching dynamics, and multi-objective motor control. Together, our results provide a principled, general-purpose computational account of how causal inference and motor behavior interact in dynamic, uncertain environments, offering a bridge between theories of perception and action.
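To make the mixture-of-controllers idea concrete, the following is a minimal, hypothetical Python sketch, not the authors' implementation or any of the published tasks. It assumes two candidate causal models of a one-dimensional tracking problem, each paired with its own Kalman filter and finite-horizon LQR controller; the executed command is the posterior-weighted mixture of the per-model commands, and the posterior over causal structures is updated from each model's innovation likelihood. All dynamics, noise levels, and cost weights below are illustrative assumptions.

```python
# Hypothetical sketch of a belief-weighted mixture of controllers in a two-model
# switching linear dynamical setting (illustrative only, not the authors' code).
import numpy as np

rng = np.random.default_rng(1)
T = 60                                         # number of time steps
dt = 0.1

# State x = [hand position, target position]; the action u moves the hand only.
B = np.array([[dt], [0.0]])
A = {1: np.array([[1.0, 0.0], [0.0, 1.00]]),   # model 1: target holds its position
     2: np.array([[1.0, 0.0], [0.0, 0.96]])}   # model 2: target drifts back toward zero
Qproc = 0.005 * np.eye(2)                      # process noise (shared, for simplicity)
H = np.eye(2)                                  # both state components are observed
Robs = 0.05 * np.eye(2)                        # observation noise

# Quadratic cost: (hand - target)^2 plus an effort penalty on u.
Qcost = np.array([[1.0, -1.0], [-1.0, 1.0]])
Rcost = np.array([[0.05]])

def lqr_gains(A_c, horizon):
    """Finite-horizon discrete LQR via backward Riccati recursion."""
    P = Qcost.copy()
    gains = []
    for _ in range(horizon):
        K = np.linalg.solve(Rcost + B.T @ P @ B, B.T @ P @ A_c)
        P = Qcost + A_c.T @ P @ (A_c - B @ K)
        gains.append(K)
    return gains[::-1]                         # gains[t] is the feedback gain at step t

K = {c: lqr_gains(A[c], T) for c in (1, 2)}

# Per-model Kalman filters plus a posterior belief over the two causal models.
xhat = {c: np.array([0.0, 1.0]) for c in (1, 2)}
Sig = {c: 0.1 * np.eye(2) for c in (1, 2)}
belief = {1: 0.5, 2: 0.5}

x_true = np.array([0.0, 1.0])                  # the world actually follows model 2
for t in range(T):
    # Mixture of controllers: each controller acts on its own state estimate;
    # commands are averaged with weights given by the belief in each causal model.
    u = sum(belief[c] * (-K[c][t] @ xhat[c]) for c in (1, 2))

    # Simulate the true dynamics and a noisy observation.
    x_true = A[2] @ x_true + (B @ u).ravel() + rng.multivariate_normal(np.zeros(2), Qproc)
    y = H @ x_true + rng.multivariate_normal(np.zeros(2), Robs)

    # Kalman update under each model; the innovation likelihood drives the belief update.
    like = {}
    for c in (1, 2):
        x_pred = A[c] @ xhat[c] + (B @ u).ravel()
        S_pred = A[c] @ Sig[c] @ A[c].T + Qproc
        S_y = H @ S_pred @ H.T + Robs
        innov = y - H @ x_pred
        Kg = S_pred @ H.T @ np.linalg.inv(S_y)
        xhat[c] = x_pred + Kg @ innov
        Sig[c] = (np.eye(2) - Kg @ H) @ S_pred
        like[c] = np.exp(-0.5 * innov @ np.linalg.solve(S_y, innov)) \
                  / np.sqrt(np.linalg.det(2 * np.pi * S_y))

    # Approximate Bayesian update of the belief over causal structures
    # (fixed structure here; an SLDS would add a switching prior between models).
    z = belief[1] * like[1] + belief[2] * like[2]
    belief = {c: belief[c] * like[c] / z for c in (1, 2)}

print(f"final belief in model 2 (drifting target): {belief[2]:.2f}")
print(f"final hand-target error: {abs(x_true[0] - x_true[1]):.3f}")
```

The sketch omits elements of the full framework described in the abstract, such as transition probabilities between causal structures and controllers optimized jointly with the inference; it is only meant to illustrate how belief-weighted control commands can arise from parallel model-specific estimators and controllers.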
Author summary
When we interact with the world, our brain constantly combines information from different senses, such as visual and vestibular input, to determine what is happening and to guide an appropriate response. However, these signals do not always originate from the same source in the world. For example, the motion of an object we see might not match the self-motion we feel. In such situations, the brain faces the challenge of "causal inference": do these signals arise from a common source or from separate events?
Most previous research has studied this problem in simplified, purely perceptual settings. But in everyday life, we move through the world and make decisions in real time based on noisy and often conflicting sensory input. In this work, we develop and validate a new computational framework that combines causal inference with models of movement control. This approach allows us to simulate more realistic situations in which perception and action interact continuously. We show that our model not only reproduces results from earlier experiments but also makes predictions about how people behave in more complex tasks, such as intercepting moving targets or navigating uncertain environments. Our framework offers a new way to study how the brain integrates sensory information to guide behavior in the real world.