Cortical oscillations predict auditory grouping in listeners with and without hearing loss
Abstract
Auditory grouping relies on the ability to bind tones with coherent spectral features over time to form auditory objects. Sensorineural hearing loss (SNHL) degrades spectral resolution, and the extent of this degradation varies with the listening configuration. However, it remains unclear how SNHL impacts auditory grouping and whether different listening configurations affect this ability. This study investigated task performance and cortical activity during auditory object detection in four groups with different listening configurations: twenty normal-hearing (NH) listeners, seventeen bilateral hearing aid users with acoustic-only stimulation (A-only), thirty-one cochlear implant (CI) users with acoustic and electric stimulation (A+E), and seventeen bilateral CI users with electric-only stimulation (E-only). While electroencephalography was recorded, participants performed a stochastic figure-ground task requiring the detection of spectrally and temporally coherent tone pips embedded in a background of random-frequency tone clouds. All groups achieved above 80% accuracy, though the CI groups performed more poorly than the NH and A-only groups. Compared to NH listeners, object-related evoked responses were weaker in A-only listeners and absent in both CI groups. Delta (2-3.5 Hz) and theta (4-7 Hz) event-related synchronization (ERS) to the auditory objects was observed only in the NH group, with the exception of a delta ERS in the A+E group. However, all groups exhibited alpha (8-15 Hz) and beta (17-30 Hz) event-related desynchronization (ERD), with no significant group differences. Notably, individual differences in alpha and beta ERD predicted task accuracy. These findings suggest that alpha and beta cortical activity, measured during an auditory object detection task, reflects auditory grouping in any listening configuration.