FEATURE-SPECIFIC ANTICIPATORY PROCESSING FADES DURING HUMAN SLEEP

Abstract

Imagine you are listening to a familiar song on the radio. As the melody and rhythm unfold, you can often anticipate the next note or beat, even before it plays. This ability demonstrates the brain’s capacity to extract statistical regularities from sensory input and to generate predictions about future sensory events. It is considered automatic, requiring no conscious effort or attentional resources (1–4). But to what extent does this predictive ability operate when attention is greatly reduced, such as during sleep? Experimental findings from animal and human studies reveal a complex picture of how the brain engages in predictive processing during sleep (5–13). Although evidence suggests that the brain reacts differentially to unexpected stimuli and rhythmic music (5,7,13), there is a notable disruption in feedback processing, which is essential for generating accurate predictions of upcoming stimuli (10). Here, for the first time, we examine the brain’s ability during sleep to predict or pre-activate low-level features of expected stimuli before their presentation. We used sequences of predictable or unpredictable (random) tones in a passive-listening paradigm while recording simultaneous electroencephalography (EEG) and magnetoencephalography (MEG) during wakefulness and sleep. We found that during wakefulness, N1 sleep, and N2 sleep, subtle changes in tone frequency elicit distinct neural activations. However, these activations are less distinct and less sustained during sleep than during wakefulness. Critically, replicating previous work in wakefulness (4), we find evidence that neural activations specific to the anticipated tone occur before its presentation. Extending previous findings, we show that such predictive neural patterns fade as individuals fall asleep.

In Brief

The extent to which predictive processing takes place in sleep is yet to be determined. Using a passive-listening EEG/MEG paradigm, Topalidis et al. show that auditory representations in sleep are brief and unstable, easily overwritten by subsequent inputs, which possibly hinders the tracking and extraction of sensory associations.

Highlights

  • Participants passively listened to random and predictable sequences of tones during both wakefulness and sleep, without being made aware of the underlying pattern.

  • The brain retains the ability to process basic low-level features during sleep.

  • While these feature-specific responses are preserved during sleep, they are less distinct and less sustained than in wakefulness.

  • Unlike in wakefulness, during sleep, the brain does not predict or anticipate upcoming sounds, despite continuing to process basic auditory information.
