Gaze dynamics prior to navigation support hierarchical planning
Abstract
The task of planning future actions in an uncertain world results in massive state spaces that preclude exhaustive search and other strategies explored in the domains of both human decision-making and computational agents. One plausible solution to this dimensionality explosion is to decompose the task into subgoals that match the information geometry of the task at hand. However, how individuals identify a productive hierarchy, and perceive and select subgoals suitable for planning, is not well understood. To investigate this topic, we designed a virtual-reality-based behavioral experiment that collected eye movements during a pre-navigation planning phase. By capturing gaze dynamics correlated with the simulative processes used in planning, we were able to identify the spatiotemporal evolution of visual search under uncertainty. Our results highlight gaze dynamics indicative of a search process with hierarchical structure: a decreasing trend in gaze distance from the origin and a broad-to-narrow shift (shrinking saccade distances and lengthening fixation durations) as plans are established. In line with prior work, critical tiles to which landscape connectivity is most sensitive were the strongest predictors of visual attention. We also find that deeper planning was correlated with success only on the most complex maps (e.g., those with more information nodes, a higher branching factor, and more forks, according to an info-graphical map analysis). This study highlights the role of embodied visual search during planning, and the skill dependence of the specific subgoals and hierarchical decompositions that unlocked successful performance.
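The gaze-dynamics measures named above (distance of gaze from the origin, saccade distance, fixation duration) can be made concrete with a short sketch. The following Python example is illustrative only and is not the authors' analysis pipeline; the function name, the velocity-threshold event detection, and the threshold value are assumptions made for the sake of the example.

```python
import numpy as np

def gaze_metrics(t, x, y, origin, vel_thresh=50.0):
    """Compute simple gaze metrics from a gaze trace.

    t, x, y    : 1-D arrays of timestamps (s) and gaze positions (map units)
    origin     : (x, y) of the navigation start location
    vel_thresh : velocity threshold separating saccadic from fixational samples
                 (I-VT style; map units per second, value chosen arbitrarily here)
    """
    t, x, y = map(np.asarray, (t, x, y))

    # Distance of each gaze sample from the origin (start location).
    dist_from_origin = np.hypot(x - origin[0], y - origin[1])

    # Sample-to-sample velocity; samples above threshold count as saccadic.
    dt = np.diff(t)
    step = np.hypot(np.diff(x), np.diff(y))
    vel = step / np.where(dt > 0, dt, np.nan)
    is_saccade = vel > vel_thresh

    # Saccade distances: displacement over each run of saccadic samples.
    # Fixation durations: elapsed time over each run of non-saccadic samples.
    saccade_dists, fixation_durs = [], []
    i = 0
    while i < len(is_saccade):
        j = i
        while j < len(is_saccade) and is_saccade[j] == is_saccade[i]:
            j += 1
        if is_saccade[i]:
            saccade_dists.append(np.hypot(x[j] - x[i], y[j] - y[i]))
        else:
            fixation_durs.append(t[j] - t[i])
        i = j

    return dist_from_origin, np.array(saccade_dists), np.array(fixation_durs)
```

Under this sketch, the abstract's "broad-to-narrow shift" would appear as saccade distances trending downward and fixation durations trending upward over the course of the planning phase.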