Eye Movements in Silent Visual Speech track Unheard Acoustic Signals and Relate to Hearing Experience
Abstract
Behavioral and neuroscientific studies have shown that watching a speaker’s lip movements aids speech comprehension. Intriguingly, even when videos of speakers are presented silently, various cortical regions track auditory features such as the speech envelope. Recently, we demonstrated that eye movements track low-level acoustic information during attentive listening to speech. In this study, we investigated whether ocular speech tracking also occurs during visual speech and how it influences cortical tracking of silent speech. Furthermore, we compared data from hearing individuals with those from congenitally deaf individuals and individuals with acquired deafness or hearing loss (DHH; deaf or hard of hearing) to assess how the onset of auditory deprivation (early vs. late) affects neural and ocular speech tracking during silent lip-reading. Using magnetoencephalography (MEG), we examined ocular and neural speech tracking in 75 participants observing silent videos of a speaker played forward and backward. Our main finding is a clear ocular tracking effect of the unheard speech, dominant below 1 Hz, which was not present for the lip movements. Similarly, we observed neural tracking of unheard speech at frequencies up to 1.3 Hz in temporal regions of hearing participants. Importantly, neural tracking was not directly linked to ocular tracking in this study. Strikingly, across the listening groups, deaf participants with prior auditory experience showed stronger ocular speech tracking than hearing participants, whereas no ocular speech tracking effect was revealed for the congenitally deaf participants, albeit in a very small sample. This study extends our previous work by demonstrating the involvement of eye movements in speech processing even in the absence of acoustic input.