What did you said? Differential impacts of acoustic challenge on semantic, syntactic, and prediction-related ERPs during speech processing


Abstract

Purpose: When listening to acoustically challenging speech, listeners may strategically shift neural resources across multiple language-related cognitive processes in order to compensate for the additional demands. In the current study, we examined the impact of acoustic challenge (via speech-shaped background noise) on three language-related event-related brain potentials (ERPs): the N400 (linked to semantic retrieval), the P600 (linked to syntactic integration), and a late frontal response (linked to predictive processing).

Method: Young normal-hearing adults (N = 48) listened in quiet and in noise to contextually constraining sentences that ended in an expected word, an unexpected word, or a morphosyntactic violation of the expected word while EEG data were collected.

Results: Replicating prior work, we showed that an N400 expectancy effect was both reduced in amplitude and delayed in onset latency in noise. We found that a P600 syntax effect was reduced in amplitude by the presence of background noise. We additionally observed that the P600 response was only present for trials in which the listener successfully identified syntactic violations (irrespective of noise). In contrast to the reductions in the N400 and P600, the late frontal response was not reduced in noise.

Conclusions: Collectively, this suggests that while semantic retrieval and syntactic integration may be impaired by acoustic challenge, listeners may still prioritize neural resources toward prediction-related processes to help compensate for increased perceptual demands.