Behavioral and Modeling Evidence that Eye Movements Bias Self-motion Perception

Abstract

To navigate the world, humans must integrate what they see with how they move. As the body moves, for example, the eyes rotate to explore the environment; these eye rotations, in turn, alter the visual signals used to judge body motion. Yet the practical impact of gaze dynamics on self-motion perception remains poorly understood. We tested how gaze position and gaze velocity shape self-motion perception—specifically heading direction—from visual signals in two behavioral experiments. In Experiment 1, we directly manipulated gaze position and velocity; in Experiment 2, a range of task demands evoked distinct gaze patterns. Across both experiments, heading estimates showed systematic, gaze-dependent errors: these estimates shifted toward the direction of eye motion and grew with horizontal gaze eccentricity and speed. A Bayesian ideal observer reproduced these error patterns across participants and tasks when it included three known features of visuomotor processing: (i) encoding retinal motion with eccentricity-dependent noise, (ii) underestimating eye-rotation speed, and (iii) a prior for moving straight ahead. These results reveal a lawful coupling between oculomotor behavior and heading perception. They further suggest that natural gaze strategies, such as keeping gaze near the optic-flow singularity and limiting pursuit speed, help mitigate gaze-dependent biases during navigation.
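The three model components named in the abstract can be sketched as a minimal Gaussian observer. This is an illustrative assumption-laden sketch, not the authors' fitted model: all parameter names and values below are hypothetical, and the exact likelihood and prior forms used in the paper may differ.

```python
def heading_estimate(true_heading_deg, gaze_ecc_deg, eye_vel_dps,
                     rot_gain=0.8, noise_base=1.0, noise_slope=0.05,
                     prior_sigma=10.0):
    """Toy Bayesian observer for heading (all parameters hypothetical).

    (i)   sensory noise grows with horizontal gaze eccentricity,
    (ii)  eye-rotation speed is underestimated (rot_gain < 1),
    (iii) a Gaussian prior is centered on straight ahead (0 deg).
    """
    # (ii) incomplete compensation leaves residual rotational flow,
    # shifting the likelihood mean toward the direction of eye motion
    residual = (1.0 - rot_gain) * eye_vel_dps
    like_mean = true_heading_deg + residual
    # (i) likelihood noise increases with gaze eccentricity
    like_sigma = noise_base + noise_slope * abs(gaze_ecc_deg)
    # (iii) Gaussian prior x Gaussian likelihood: posterior mean is a
    # reliability-weighted pull toward straight ahead (prior mean = 0)
    w = prior_sigma**2 / (prior_sigma**2 + like_sigma**2)
    return w * like_mean
```

Under these assumptions the sketch reproduces the qualitative error patterns reported: pursuit in one direction shifts the estimate toward the eye-motion direction, and larger gaze eccentricity inflates sensory noise, strengthening the straight-ahead pull.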