Alpha-Band Phase Modulates Perceptual Sensitivity by Changing Internal Noise and Sensory Tuning

Abstract

Alpha-band neural oscillations (8–13 Hz) are theorized to phasically inhibit visual processing based, in part, on results showing that pre-stimulus alpha phase predicts detection (i.e., hit rates). However, recent failures to replicate and a lack of a mechanistic understanding regarding how alpha impacts detection have called this theory into question. We recorded EEG while six observers (6,020 trials each) detected near-threshold Gabor targets embedded in noise. Using signal detection theory (SDT) and reverse correlation, we observed an effect of occipital and frontal pre-stimulus alpha phase on sensitivity (d’), not criterion. Hit and false alarm rates were counterphased, consistent with a reduction in internal noise during optimal alpha phases. Perceptual reports were also more consistent when two identical stimuli were presented during the optimal phase, suggesting a decrease in internal noise rather than signal amplification. Classification images revealed sharper spatial frequency and orientation tuning during the optimal alpha phase, implying that alpha phase shapes sensitivity by modulating sensory tuning towards relevant stimulus features.
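The abstract's distinction between sensitivity (d′) and criterion follows standard signal detection theory, where counterphased hit and false-alarm rates change d′ while leaving the criterion roughly constant. A minimal sketch of the standard SDT formulas, d′ = Z(H) − Z(FA) and c = −(Z(H) + Z(FA))/2, using illustrative rates that are not taken from the study:

```python
# Standard signal detection theory measures (textbook formulas,
# not the authors' analysis code).
from statistics import NormalDist

def sdt_measures(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    """Return (d_prime, criterion) from hit and false-alarm rates."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Hypothetical counterphased rates: hits up while false alarms down
# (as at an "optimal" alpha phase) raises d_prime but barely moves c.
d_opt, c_opt = sdt_measures(0.70, 0.20)    # illustrative optimal phase
d_non, c_non = sdt_measures(0.60, 0.30)    # illustrative non-optimal phase
```

With these made-up numbers, d′ rises from about 0.78 to about 1.37 between phases while the criterion stays near 0.14–0.16, which is the pattern the abstract describes as a sensitivity effect rather than a criterion shift.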
