Evaluating perceptual evidence of cross-modal plasticity in older adults with hearing loss using an audio–visual word mismatch task
Abstract
Older adults with age-related hearing loss (OAHL) are more strongly influenced by vision during audiovisual perception tasks than typically-hearing older adults, suggesting a greater weighting of visual input in the former group. This visual “bias” has been argued to arise through compensatory, top-down cross-modal neuroplasticity that strengthens visual perceptual processing in response to hearing loss. However, current behavioural evidence does not convincingly demonstrate that these neural changes confer a visual benefit, as existing methods do not untangle top-down influences of visual processing from low-level audiovisual integration. Here, we tested for evidence of enhanced visual speech processing in OAHL using a cross-modal lexical matching task designed to bypass low-level integration and isolate higher-level visual influences. Forty-nine adults aged 59–81 years completed a task in which auditory-only and visual-only words were presented separately and serially in either audio-then-visual (A–V) or visual-then-audio (V–A) order under no-noise, 0 dB speech-to-noise ratio (SNR), and −10 dB SNR conditions. Half of the trials contained mismatching word pairs. Participants judged whether the audio and visual words matched or mismatched while eye-tracking recorded visual attention to the speaker’s face. If visual speech provides a stronger predictive influence in OAHL, this group was expected to benefit more from the V–A order. We found that mismatch detection was stronger in the A–V than the V–A condition in the no-noise and 0 dB SNR conditions, with no difference at −10 dB SNR. Greater hearing loss was associated with worse overall perceptual sensitivity and, contrary to our predictions, was not associated with better performance in the V–A condition. Hearing loss was also not associated with differences in gaze to the speaker’s eyes and mouth. Overall, these findings do not support the hypothesis that the visual bias in OAHL arises through high-level visual speech processing.
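The abstract reports mismatch-detection performance and "perceptual sensitivity" in a yes/no match–mismatch task; the standard signal-detection measure for such a task is d′, computed from the hit rate on mismatching trials and the false-alarm rate on matching trials. The sketch below illustrates that computation; the function name and the example rates are purely illustrative, not taken from the paper.

```python
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Sensitivity index d' = z(hit rate) - z(false-alarm rate),
    where z is the inverse of the standard normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical example: a participant responds "mismatch" on 85% of
# mismatching trials (hits) and on 20% of matching trials (false alarms).
print(round(d_prime(0.85, 0.20), 2))  # -> 1.88
```

In practice, hit and false-alarm rates of exactly 0 or 1 are usually nudged (e.g. by 1/(2N)) before the z-transform, since the inverse normal CDF is undefined at those extremes.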