Enhancing the study of peripheral vision in virtual reality with improved eye tracking

Abstract

This study introduces an enhanced, open-source eye-tracking system integrated into an immersive VR environment. Built with the Tobii Pro SDK and Unity, our modular system provides precise gaze estimation and can be deployed in behavioral experiments. We deployed our module in two experiments: Experiment 1 involved passive observation, where participants fixated on stop signs while detecting peripheral targets; Experiment 2 added an active driving component, requiring participants to control the vehicle while performing the detection task. Results showed high fixation success and detection rates, with performance declining systematically for smaller and more eccentric targets, consistent with known peripheral vision constraints. Following these tests, we extended our module to incorporate a novel iris-based ratio method that corrects pupil diameter for environmental luminance and headset-induced artifacts. This correction enables reliable, real-time assessment of cognitive load via pupillometry despite VR’s dynamic conditions. Overall, our findings demonstrate that this enhanced eye-tracking system effectively captures gaze behavior and cognitive load simultaneously in ecologically valid VR scenarios. This advancement enables detailed study of attention and workload dynamics during multitasking and offers promising applications for adaptive systems responsive to real-time user states.
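The core idea behind an iris-based ratio correction can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the function name, parameter names, and example values are assumptions. The premise is that iris diameter is anatomically stable, so dividing pupil diameter by iris diameter cancels apparent-size changes from eye-camera geometry, and normalizing against a luminance-matched baseline isolates dilation attributable to cognitive load.

```python
# Hypothetical sketch of an iris-based ratio correction for pupillometry.
# Names and values are illustrative assumptions, not the paper's code.

def corrected_pupil_ratio(pupil_mm: float, iris_mm: float,
                          baseline_ratio: float) -> float:
    """Return the pupil/iris ratio normalized by a luminance-matched
    baseline ratio, so values > 1 indicate dilation beyond the light
    reflex (a candidate cognitive-load signal)."""
    ratio = pupil_mm / iris_mm  # geometry-invariant pupil size
    return ratio / baseline_ratio

# Example: pupil 4.2 mm, iris 11.8 mm, baseline ratio 0.30 at the
# same luminance level.
print(round(corrected_pupil_ratio(4.2, 11.8, 0.30), 3))  # → 1.186
```

In practice the baseline ratio would be estimated per participant at each luminance level during calibration; the abstract does not specify this procedure, so treat it as one plausible design.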