Effects of audiovisual asynchrony on speech intelligibility in typically hearing adults and cochlear implant users
Abstract
PURPOSE: Speech contains auditory cues conveyed by the voice and visual cues produced by the articulators, which are commonly integrated to form salient audiovisual percepts. How individuals make use of these cues depends on listener- and stimulus-specific factors. For example, cochlear implant (CI) users rely more on visual cues, and on the perceptual gains associated with multisensory integration, than typically hearing controls, while both groups are more likely to integrate auditory and visual cues when the cues occur synchronously. Here, we examine the interaction between these effects by quantifying how asynchrony affects speech intelligibility in CI users and typically hearing controls.

METHODS: Participants were presented with audio-only, visual-only, synchronous audiovisual, and asynchronous audiovisual sentences. Intelligibility was quantified as the proportion of keywords correctly reported. The benefit of multisensory integration at each offset was computed by comparing performance to the unisensory conditions.

RESULTS: Intelligibility and multisensory gain were strongest near synchrony for both groups, tapering off as audiovisual asynchrony increased. CI users showed larger decrements in performance with increasing offset than controls. At the largest audio-leading offset, CI users’ performance continued to decrease, while controls improved relative to smaller offsets.

CONCLUSION: Typically hearing listeners benefit from multisensory integration over a broader range of audiovisual offsets than CI users. Interestingly, at extreme offsets CI users appear to be impaired by visual speech cues relative to typically hearing individuals, suggesting that they may attempt to use visual speech cues to inform intelligibility even when those cues fall outside the range of perceptual integration. These findings align with the literature on visual bias in CI users and raise questions about potential compensatory mechanisms for speech intelligibility.