Combined eye tracking and electroencephalography during referential selection in dyadic interaction

Abstract

Understanding language in real-world interaction requires methods that integrate auditory, visual, and neural signals. We present a proof-of-concept study combining mobile electroencephalography (EEG) and eye tracking to investigate referential selection during dyadic communication. Using a simplified version of the director task, pairs of participants engaged in naturalistic object-movement instructions while EEG and gaze data were recorded and synchronized via Lab Streaming Layer (LSL). This approach allowed us to calculate overlap-corrected, regression-based event-related potentials (rERPs) time-locked to spoken nouns and to participants’ fixations, focusing on the N400/P300 (300–500 ms) and later (500–800 ms) time windows. Our results show that gaze behavior and fixation-related potentials provide crucial information for interpreting language-related ERPs: targets competing with occluded referents elicited stronger P300 effects, suggesting higher attentional demands, while fixation timing systematically modulated neural responses. Contrary to predictions, competitors received little visual attention, indicating that participants prioritized directly task-relevant information. These findings highlight the potential of multimodal data integration for understanding attentional and predictive mechanisms in real-world communication. Importantly, we demonstrate that established analytic techniques, such as artifact subspace reconstruction, independent component analysis, and linear deconvolution for ERP calculation, are applicable to noisy, naturalistic data sets. This study thus provides a methodological framework for linking gaze and neural activity during interactive language processing beyond laboratory constraints. All preprocessing and analysis scripts, together with example data sets, are openly available on OSF (https://osf.io/5ds4z); a modified and optimized Python implementation is also available on GitHub (https://github.com/XlinCLab/DGAME).
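To illustrate the synchronization step, the sketch below shows how EEG and gaze streams might be resolved and pulled onto a common clock with pylsl, the Python interface to LSL. The stream types, sample counts, and loop structure are illustrative assumptions, not the recording setup used in the study.

```python
# Minimal sketch: pulling time-aligned EEG and gaze samples via pylsl.
# Stream types and counts are illustrative, not the study's configuration.
from pylsl import StreamInlet, resolve_byprop

# Resolve the two streams by their LSL 'type' metadata (hypothetical labels).
eeg_inlet = StreamInlet(resolve_byprop("type", "EEG", timeout=5.0)[0])
gaze_inlet = StreamInlet(resolve_byprop("type", "Gaze", timeout=5.0)[0])

# time_correction() estimates each stream's clock offset to the local clock,
# so timestamps from both devices map onto a single shared timeline.
eeg_offset = eeg_inlet.time_correction()
gaze_offset = gaze_inlet.time_correction()

eeg_samples, gaze_samples = [], []
for _ in range(1000):
    sample, ts = eeg_inlet.pull_sample(timeout=1.0)
    if sample is not None:
        eeg_samples.append((ts + eeg_offset, sample))
    sample, ts = gaze_inlet.pull_sample(timeout=0.0)  # non-blocking poll
    if sample is not None:
        gaze_samples.append((ts + gaze_offset, sample))
```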
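Before rERP estimation, free-viewing EEG is typically cleaned of ocular and muscle artifacts. The following is a minimal sketch of the independent component analysis step using MNE-Python; the file name, filter settings, and excluded component indices are placeholders, and the artifact subspace reconstruction step named in the abstract is omitted here for brevity.

```python
# Minimal sketch of ICA-based artifact removal with MNE-Python.
# File name and component indices are placeholders, not the study's pipeline.
import mne

raw = mne.io.read_raw_fif("dyad01_eeg_raw.fif", preload=True)  # hypothetical file
raw.filter(l_freq=1.0, h_freq=None)  # high-pass filtering aids ICA decomposition

ica = mne.preprocessing.ICA(n_components=20, random_state=97, max_iter="auto")
ica.fit(raw)

# In free-viewing data, ocular components can be identified by comparison
# with the synchronized gaze signal; the indices below are placeholders.
ica.exclude = [0, 3]
raw_clean = ica.apply(raw.copy())
```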
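The overlap correction for rERPs amounts to a linear deconvolution: instead of averaging epochs, the continuous EEG is regressed on a time-expanded design matrix with one finite-impulse-response predictor per event type and latency, so that temporally overlapping responses to nouns and fixations are estimated jointly rather than smeared into each other. Below is a minimal single-channel NumPy sketch of this idea; the sampling rate, window, and event times are made up for illustration.

```python
# Minimal sketch of overlap-corrected rERPs via linear deconvolution
# (time-expansion regression). Single channel, two event types; all
# numbers below are illustrative.
import numpy as np

fs = 100                      # sampling rate (Hz), illustrative
win = np.arange(-20, 81)      # FIR window: -200 ms to +800 ms in samples
n_samples = 60 * fs           # one minute of continuous EEG
eeg = np.random.randn(n_samples)  # stand-in for real data

# Hypothetical event onsets (in samples) for nouns and fixations.
events = {"noun": np.array([500, 1500, 2600]),
          "fixation": np.array([520, 1450, 2700])}

# Time-expanded design matrix: one column per (event type, latency) pair.
X = np.zeros((n_samples, len(events) * len(win)))
for e, (name, onsets) in enumerate(events.items()):
    for j, lag in enumerate(win):
        idx = onsets + lag
        idx = idx[(idx >= 0) & (idx < n_samples)]
        X[idx, e * len(win) + j] = 1.0

# Least-squares solve; each coefficient block is one event type's rERP,
# with temporally overlapping responses disentangled by the regression.
beta, *_ = np.linalg.lstsq(X, eeg, rcond=None)
rerp_noun = beta[:len(win)]
rerp_fixation = beta[len(win):]
```

Because noun onsets and fixations can fall close together in time, the joint regression attributes overlapping activity to the correct event type, which simple epoch averaging cannot do.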
