Distilling noise characteristics and prior expectations in multisensory causal inference
Abstract
The perception of the external world relies on integrating information from multiple sensory modalities. To do this effectively, the brain must determine whether sensory signals come from a common source and, if so, combine them to reduce perceptual uncertainty. While Bayesian observer models have been successful in accounting for human multisensory causal inference decisions, they typically rely on simplifying assumptions that may not reflect the true complexity of human perception. In this study, we challenge two assumptions common in Bayesian multisensory perception models: homoskedastic (constant across space) sensory noise and Gaussian priors. We collected an auditory-visual perceptual dataset featuring both unisensory and bisensory tasks in which participants provided either stimulus location estimates or same/different source judgments. We then developed a flexible semiparametric approach that allowed us to infer the shapes of the sensory noise and prior from participants' data, and to 'distill' them into new model classes through visual inspection of the semiparametrically fitted functions. We find that human multisensory perception is best described by eccentricity-dependent sensory noise that plateaus in the periphery and a prior distribution with a narrow central peak and smoother tails. We also find evidence for auditory range recalibration and increased sensory noise in multisensory conditions, suggesting complex interactions between sensory modalities. These findings deviate substantially from traditional modeling assumptions and highlight the value of making Bayesian cognitive models more data-driven. Overall, our study demonstrates the value of systematically exploring model assumptions in multisensory research and provides a new set of modeling tools for perceptual causal inference.
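To make the modeling assumptions being challenged concrete, here is a minimal sketch of the standard Gaussian Bayesian causal-inference observer (homoskedastic noise, Gaussian prior) that models like this one build on and depart from. All parameter names and values are illustrative, not taken from the study: `sigma_v` and `sigma_a` are the (constant) visual and auditory noise standard deviations, `sigma_p` the Gaussian spatial prior width, and `p_common` the prior probability of a common cause.

```python
import math

def posterior_common_cause(x_v, x_a, sigma_v, sigma_a, sigma_p, p_common):
    """Posterior probability that visual and auditory measurements x_v, x_a
    arose from a single source, under Gaussian noise and a zero-mean
    Gaussian spatial prior (the standard assumptions questioned here)."""
    var_v, var_a, var_p = sigma_v**2, sigma_a**2, sigma_p**2

    # Likelihood of the measurements under a common cause (C = 1):
    # the shared source location is marginalized out analytically.
    denom = var_v * var_a + var_v * var_p + var_a * var_p
    like_c1 = math.exp(-0.5 * ((x_v - x_a)**2 * var_p
                               + x_v**2 * var_a
                               + x_a**2 * var_v) / denom) \
              / (2 * math.pi * math.sqrt(denom))

    # Likelihood under independent causes (C = 2): each measurement is
    # marginalized over its own source location drawn from the prior.
    like_c2 = (math.exp(-0.5 * x_v**2 / (var_v + var_p))
               / math.sqrt(2 * math.pi * (var_v + var_p))) \
            * (math.exp(-0.5 * x_a**2 / (var_a + var_p))
               / math.sqrt(2 * math.pi * (var_a + var_p)))

    # Bayes' rule over the two causal structures.
    return like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))
```

With coincident cues the posterior favors a common cause, and it falls off as the audiovisual disparity grows; the paper's contribution is to relax the constant-`sigma` and Gaussian-prior ingredients of exactly this computation, replacing them with eccentricity-dependent noise and a heavy-tailed, sharply peaked prior inferred from data.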