ERP Signatures of Stimulus Choice in Gaze-Independent BCI Communication

Abstract

Objective: This study aimed to identify electrophysiological markers (event-related potentials, ERPs) of intentional, need-related mental activity under controlled gaze fixation, with potential applications in brain–computer interface (BCI) development for individuals with severe motor impairments.

Methods: Using stimuli from the PAIN Pictionary, a pictogram database for non-verbal communication in locked-in syndrome (LIS) contexts, neural responses were recorded via high-density EEG in 30 neurologically healthy adults (25 included after artifact-based exclusion). Participants viewed randomized sequences of pictograms representing ten fundamental need categories (e.g., "I am cold", "I am in pain"), with one category designated as the target per sequence. Each pictogram was followed by a visual cue prompting a button press: during training, participants executed the press; during the main task, they instead performed right-hand motor imagery while maintaining central fixation.

Results: ERP analyses revealed a robust P300 response (450–650 ms; p < 0.0002) over centro-parietal regions for target cues, reflecting enhanced attentional allocation and stimulus choice. An early Contingent Negative Variation (CNV, 450–750 ms; p = 0.008) over fronto-lateral sites indexed anticipatory attention and motor preparation, while a left-lateralized late CNV (2250–2750 ms; p = 0.035) reflected preparation of a finalized motor plan for the forthcoming imagined right-hand response. A centro-parietal P600 component (600–800 ms; p = 0.044) emerged during response monitoring, reflecting evaluative and decisional processes. swLORETA source analyses localized this activity to a distributed network spanning prefrontal, premotor, motor, parietal, and limbic areas.

Conclusions: These findings demonstrate that motor imagery alone can modulate pattern-onset ERP components without overt movement or gaze shifts, supporting the translational potential of decoding need-related intentions for thought-driven communication systems in individuals with profound motor impairments.
