Crossmodal Interaction of Flashes and Beeps Across Time and Number Follows Bayesian Causal Inference

Abstract

Multisensory perception requires the brain to dynamically infer causal relationships between sensory inputs across dimensions such as time and space. Bayesian Causal Inference (BCI) models have provided a robust framework for understanding sensory processing in unidimensional settings, where stimuli across sensory modalities vary along a single dimension such as spatial location or numerosity (Samad et al., 2015). Real-world sensory processing, however, involves multidimensional cues, where the alignment of information across multiple dimensions determines whether the brain perceives a unified or a segregated source. To investigate sensory processing under more realistic conditions, this study introduces an expanded BCI model that incorporates multidimensional information, specifically numerosity and temporal discrepancies. We tested the extended model using a modified sound-induced flash illusion (SiFI) paradigm with manipulated audiovisual disparities. Integration probability decreased with increasing temporal discrepancy, and the proposed multidimensional BCI model accurately predicted multisensory perception outcomes across the entire range of stimulus conditions. This multidimensional framework extends the BCI model's applicability, providing deeper insight into the computational mechanisms underlying multisensory processing and offering a foundation for future quantitative studies of naturalistic sensory perception.
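To make the core computation concrete, the following is a minimal sketch of the standard unidimensional BCI model that this work extends (the closed-form common-cause posterior of Körding et al., 2007, with Gaussian likelihoods). The function name, parameter values, and prior settings here are illustrative assumptions, not the authors' fitted model; the paper's contribution is generalizing this inference to multiple stimulus dimensions.

```python
import math

def bci_posterior_common(x_a, x_v, sigma_a=1.0, sigma_v=1.0,
                         mu_p=0.0, sigma_p=10.0, p_common=0.5):
    """Posterior probability that auditory and visual cues share one cause.

    x_a, x_v: internal auditory/visual measurements (e.g., numerosity estimates)
    sigma_a, sigma_v: sensory noise standard deviations (illustrative values)
    mu_p, sigma_p: Gaussian prior over the stimulus dimension (assumed)
    p_common: prior probability of a common cause (assumed)
    """
    va, vv, vp = sigma_a**2, sigma_v**2, sigma_p**2

    # Likelihood of the cue pair under a single shared source (C = 1),
    # integrating the latent source out analytically.
    denom1 = va * vv + va * vp + vv * vp
    l1 = math.exp(-0.5 * ((x_a - x_v)**2 * vp
                          + (x_a - mu_p)**2 * vv
                          + (x_v - mu_p)**2 * va) / denom1) \
        / (2 * math.pi * math.sqrt(denom1))

    # Likelihood under two independent sources (C = 2).
    l2 = math.exp(-0.5 * ((x_a - mu_p)**2 / (va + vp)
                          + (x_v - mu_p)**2 / (vv + vp))) \
        / (2 * math.pi * math.sqrt((va + vp) * (vv + vp)))

    # Bayes' rule over the causal structure.
    return p_common * l1 / (p_common * l1 + (1 - p_common) * l2)
```

Consistent with the result reported above, this posterior falls as the cross-modal discrepancy `|x_a - x_v|` grows, so the model integrates nearby cues and segregates distant ones; the multidimensional extension applies the same logic jointly to numerosity and timing.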