Neural Computational Model Predicts Attentional Dynamics during Immersive Search in Virtual Reality


Abstract

Much of the scientific understanding of visual attention has come from desktop paradigms in which body, head, and eye movements are restricted. This stands in contrast to our ability to search and navigate with few constraints in the real world. To bridge this gap, a computational model parametrized on a wide range of well-controlled desktop search tasks was evaluated on its ability to predict search behavior in a less constrained, immersive virtual-reality environment. A set of validation metrics showed that the model’s attention map could predict empirical behaviors such as eye gaze and manual responses in VR, even though the field of view changed from moment to moment with the participant’s movements. The present work quantifies the real-world applicability of laboratory-based theory and highlights ways to address outstanding limitations in explaining naturalistic visual search behaviors.
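To make the evaluation concrete: comparing a model's attention map against recorded eye gaze is commonly done with metrics such as normalized scanpath saliency (NSS), which z-scores the map and averages its values at fixated locations. The abstract does not specify which validation metrics were used, so the following is an illustrative sketch of one standard metric, not the authors' actual pipeline; the function name and inputs are assumptions.

```python
import numpy as np

def nss(attention_map, fixations):
    """Normalized scanpath saliency: mean z-scored map value at fixations.

    attention_map -- 2D array of predicted attention values
    fixations     -- list of (row, col) fixation coordinates
    Positive NSS means fixations land on above-average map regions.
    """
    z = (attention_map - attention_map.mean()) / attention_map.std()
    rows, cols = zip(*fixations)
    return float(z[list(rows), list(cols)].mean())
```

Scores well above zero indicate that gaze concentrates where the model predicts attention; scores near zero indicate chance-level prediction.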
