Taking the eye-tracker out to dinner: characterizing spatial attention biases during an everyday behavior using computer vision
Abstract
Visual exploration during everyday tasks reveals attentional processes and offers promising avenues for clinical assessment. In this study, we examined whether the spatial attention bias induced by the presence of a mobile phone during a routine activity, eating dinner, can be effectively captured using wearable sensors that record gaze and body orientation. In a within-subject design, participants ate spaghetti while their mobile phone was either absent or placed on the left or right side of their tray. Our analyses focused on deviations in gaze and body orientation from the center of the plate, and on fixations on target objects automatically extracted through computer vision. Phone placement shifted gaze toward its location, producing a clear lateralization throughout the meal: without a phone, gaze remained centered on the plate; with a phone, participants fixated more on objects near the phone and less on those on the opposite side. These results demonstrate that wearable eye-tracking can detect spatial attention biases in natural behavior. Integrating computer vision enabled automatic contextualization of gaze data, allowing for the extraction of meaningful features related to specific elements of the visual environment. This scalable, non-invasive, and ecologically valid approach holds promise for assessing attentional dynamics in real-world contexts.
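The analysis described above, contextualizing fixations against objects detected by computer vision and measuring lateral gaze deviation from the plate center, can be sketched as follows. This is a minimal illustration, not the authors' code: the class and function names, the bounding-box representation, and the first-match assignment rule are all assumptions for the sake of the example.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box from a hypothetical object detector."""
    label: str
    x1: float
    y1: float
    x2: float
    y2: float

    def contains(self, x: float, y: float) -> bool:
        return self.x1 <= x <= self.x2 and self.y1 <= y <= self.y2

    @property
    def cx(self) -> float:
        """Horizontal center of the box."""
        return (self.x1 + self.x2) / 2

def label_fixation(fix, boxes):
    """Assign a fixation (x, y) to the first detected box that contains it,
    falling back to 'background' otherwise."""
    for box in boxes:
        if box.contains(*fix):
            return box.label
    return "background"

def lateral_bias(fixations, plate: Box) -> float:
    """Mean signed horizontal offset of fixations from the plate center
    (negative = biased left of the plate, positive = biased right)."""
    return sum(x - plate.cx for x, _ in fixations) / len(fixations)
```

With the plate centered and a phone detected to its right, fixations drawn toward the phone yield a positive `lateral_bias`, which is one simple way the lateralization reported in the abstract could be quantified per meal.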