Ocular speech tracking persists in blindness, but its dynamics and oculo-cerebral connectivity depend on visual status
Abstract
While eye movements have been shown to track the speech envelope, it is unknown whether this reflects a hard-wired mechanism or one shaped by (lifetime) audiovisual experience. Further, questions remain about whether ocular tracking is modulated by speech intelligibility and which brain regions drive these synchronized eye movements. Here, we investigate ocular speech tracking in blindfolded early blind, late blind, and sighted individuals using magnetoencephalography (MEG) and source-reconstructed oculomotor signals while participants listened to narrative speech of varying intelligibility. We found that oculomotor activity tracks acoustic speech features and, unlike neural speech tracking, is not modulated by intelligibility. Interestingly, this tracking comprised two frequency-specific components: a low-frequency (∼1 Hz) effect present across all groups, indicating that visual experience is not required, and a high-frequency (∼6 Hz) effect reduced in early- and late-blind individuals. Moreover, this finding is not driven by oculo-cerebral connectivity, as late-blind individuals exhibit stronger connectivity between the eyes and the left temporal cortices without a corresponding increase in ocular tracking. In conclusion, ocular speech tracking seems to respond selectively to acoustic but not to intelligibility features of speech, and it does not require visual experience to develop. It may thus represent a hard-wired oculomotor mechanism within the oculo-cerebral network involved in speech processing.
Significance Statement
Eye movements provide a unique window into the interaction between auditory and visual systems. By studying early blind, late blind, and sighted individuals, we demonstrate that speech-related eye movements arise from at least two distinct mechanisms: a low-frequency component that occurs independently of (lifetime) visual experience and is linked to processing of acoustic speech features, and a high-frequency component shaped by prior visual exposure. Importantly, unlike its effect on neural measures, speech intelligibility does not modulate these ocular responses. This dissociation suggests that eye movements reflect mechanisms of spoken language processing that are independent of intelligibility, thereby revealing novel pathways of auditory-motor coupling and broadening our understanding of sensory integration in the absence of vision.