Confidence judgements from fused multisensory percepts

Abstract

Distinguishing between reliable and unreliable internal sensory representations is crucial for successful interaction with the environment. Precise estimates of the uncertainty of sensory representations are critical to optimally integrate multiple sensory modalities and produce a coherent interpretation of the world. However, once a multisensory percept is produced, it is not yet known whether humans still have access to the uncertainty of each sensory modality. Here, we asked human participants to perform a series of temporal bisection tasks, in either unimodal visual or auditory conditions, or in bimodal audiovisual conditions where vision and audition could be congruent or incongruent. The validity of their temporal bisection was assessed by asking participants to choose which of two consecutive decisions they were more confident was correct, basing their judgement only on the visual modality. We found that once multisensory information is integrated, participants could no longer access the unisensory information to evaluate the validity of their decisions. Comparing three generative models of confidence, we show that confidence judgments are fooled by the fused bimodal percept. These results highlight that some critical information is lost between perceptual and metaperceptual stages of processing in the human brain.
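The "optimal integration" the abstract refers to is commonly formalized as reliability-weighted (maximum-likelihood) cue combination, in which each modality's estimate is weighted by its inverse variance. The sketch below illustrates that standard ideal-observer formulation only; it is an assumption for illustration, not the authors' specific generative models of confidence, and the function name and numbers are hypothetical.

```python
import numpy as np

def fuse_estimates(mu_v, sigma_v, mu_a, sigma_a):
    """Reliability-weighted (maximum-likelihood) fusion of a visual and an
    auditory estimate, assuming independent Gaussian noise in each modality.

    Each cue is weighted by its reliability (inverse variance); the fused
    estimate is more precise than either unimodal estimate alone.
    """
    w_v = 1.0 / sigma_v**2          # visual reliability
    w_a = 1.0 / sigma_a**2          # auditory reliability
    mu_fused = (w_v * mu_v + w_a * mu_a) / (w_v + w_a)
    sigma_fused = np.sqrt(1.0 / (w_v + w_a))
    return mu_fused, sigma_fused

# Illustrative values: a noisy visual interval estimate (600 ms, sd 80 ms)
# combined with a more reliable auditory estimate (550 ms, sd 40 ms).
# The fused percept is pulled toward the auditory estimate (560 ms) and its
# sd (~36 ms) falls below that of either unimodal estimate.
print(fuse_estimates(600.0, 80.0, 550.0, 40.0))
```

Under this view, the paper's central question is whether confidence reflects the unimodal visual uncertainty (sigma_v) or only the fused uncertainty (sigma_fused); the abstract's finding suggests the latter.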