Predictive gaze orienting during navigation in virtual reality

Abstract

Natural vision is an active, predictive process guided by expectations about when and where information will appear. Yet how gaze is shaped in dynamic, multisensory environments remains poorly understood. Using immersive virtual reality with eye-tracking, we examined oculomotor behavior during naturalistic navigation. Participants cycled through a virtual city while avatar cyclists, first heard overtaking from behind via spatialized auditory cues, later became visible as they passed. These auditory cues triggered anticipatory gaze shifts to the expected locations, indicating that eye movements were guided by auditory predictions rather than reactive visual responses. Violations of auditory–spatial expectations elicited longer fixations. Critically, removing the auditory cues impaired predictive gaze orienting, delayed gaze shifts, and increased collisions with obstacles. These findings demonstrate that auditory input fundamentally shapes the predictive models guiding visual exploration and adaptive behavior in dynamic environments, underscoring the multisensory basis of active perception in real-world interactions.
