Neural signatures of associational cortex emerge in a goal-directed model of visual search

Abstract

Animals actively engage with their environment to gather information, continuously shaping both their sensory input and their behavior. Understanding this closed loop between perception and action remains a central challenge in neuroscience. A key example is active vision, in which observers decide where to look next, selectively sampling their visual environment to guide ongoing perception and action. However, despite major advances in linking neural activity to behavior and in computational modeling of vision under passive viewing conditions, the interactive aspects of natural vision remain underexplored. Visual search, the act of locating a target among distractors, exemplifies this dynamic sampling process and has long served as a core paradigm for studying visual attention. While its behavioral and neural signatures have been characterized in humans and non-human primates, a unifying model that links these neural phenomena to behavior during visual search has been lacking. Here, we present a biologically aligned neural network model trained to perform visual search directly from natural scenes by generating sequences of saccades to locate a target. The model generalizes to novel objects and scenes, produces human-like scanpaths, and recapitulates classic behavioral biases in human visual search. Strikingly, units in the model exhibit neural response properties characteristic of the fronto-parietal network, including a stable cue template in working memory, a retinocentric cue-similarity map, and prospective fixation signals. Beyond reproducing known behavioral and neural phenomena, the model reveals a representational geometry that supports cue-driven prioritization, spatial memory, and planning of future fixations. These results establish a computational framework for studying visual search as an emergent property of goal-directed perception, offering concrete predictions for neurophysiological and behavioral testing and paving the way toward a unified account of active vision.
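The search process the abstract describes, a cue template held in working memory, a retinocentric cue-similarity map over the scene, and sequential selection of the next fixation, can be caricatured in a few lines of Python. The sketch below is purely illustrative and is not the paper's network: the similarity metric (negative sum-of-squared-differences template matching) and the inhibition-of-return mechanism are assumptions chosen for simplicity, and all names are hypothetical.

```python
import numpy as np

def visual_search(scene, cue, n_fixations=5, fovea=3):
    """Toy goal-directed search loop (illustrative sketch, not the
    paper's model). Fixates the location whose patch best matches the
    cue template, suppressing already-visited locations."""
    H, W = scene.shape
    h, w = cue.shape
    visited = np.zeros((H - h + 1, W - w + 1), dtype=bool)
    scanpath = []
    for _ in range(n_fixations):
        # Cue-similarity map: negative SSD between the stored cue
        # template and every scene window (a stand-in for the
        # retinocentric similarity map described in the abstract).
        sim = np.empty((H - h + 1, W - w + 1))
        for y in range(sim.shape[0]):
            for x in range(sim.shape[1]):
                patch = scene[y:y + h, x:x + w]
                sim[y, x] = -np.sum((patch - cue) ** 2)
        sim[visited] = -np.inf  # inhibition of return: spatial memory
        y, x = np.unravel_index(np.argmax(sim), sim.shape)
        scanpath.append((int(y), int(x)))
        visited[max(0, y - fovea):y + fovea + 1,
                max(0, x - fovea):x + fovea + 1] = True
        if sim[y, x] == 0.0:  # perfect match: target found, stop
            break
    return scanpath
```

For example, embedding a 2x2 target of ones in a blank 10x10 scene and searching with a matching cue template drives the first fixation straight to the target's top-left corner and terminates the scanpath there.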
