Semantic audio-visual congruence modulates visual sensitivity to biological motion across awareness levels
Abstract
Whether cross-modal interaction requires conscious awareness of multisensory information or can occur in its absence is still an open question. Here, we investigated whether sounds can enhance detection sensitivity for semantically matching visual stimuli at varying levels of visual awareness. We presented biological motion stimuli of human actions (walking, rowing, sawing) during dynamic continuous flash suppression (CFS) to 80 participants and measured the effect of co-occurring, semantically matching or non-matching action sounds on visual sensitivity (d'). By individually thresholding stimulus contrast, we distinguished between participants who detected the motion stimuli above chance and those who detected them at chance level. Participants who reliably detected visual motion above chance showed higher sensitivity to upright than inverted biological motion across all experimental conditions. In contrast, participants detecting visual motion at chance level (i.e., during successful suppression) demonstrated this upright advantage exclusively during trials with semantically congruent sounds. Across the whole sample, the impact of sounds on visual sensitivity increased as participants' visual detection performance decreased, revealing a systematic trade-off between auditory and visual processing. Our findings suggest that semantic congruence between auditory and visual information can selectively modulate biological motion perception when visual awareness is minimal or absent, whereas more robust visual signals enable perception of biological motion independent of auditory input. Thus, semantically congruent sounds may affect visual representations as a function of the level of visual awareness.
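For reference, the sensitivity measure d' reported here is, in standard signal detection theory, derived from the hit rate H and false-alarm rate F via the inverse of the standard normal cumulative distribution function (the z-transform); the abstract does not state the authors' exact computation, so the conventional formula is given only as a sketch:

\[ d' = \Phi^{-1}(H) - \Phi^{-1}(F) \]

On this measure, a higher d' for upright than for inverted biological motion indexes the reported upright advantage independently of response bias.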