Perirhinal cortex supports object perception by integrating over visuospatial sequences
Abstract
Perception unfolds across multiple timescales. For humans and other primates, many object-centric visual attributes can be inferred 'at a glance' (i.e., given <200 ms of visual information), an ability supported by ventral temporal cortex (VTC). Other perceptual inferences require more time; to determine a novel object's identity, we might need to represent its unique configuration of visual features, requiring multiple 'glances.' Here we evaluate whether perirhinal cortex (PRC), downstream from VTC, supports object perception by integrating over such visuospatial sequences. We first compare human visual inferences directly to electrophysiological recordings from macaque VTC. While human performance 'at a glance' is approximated by a linear readout of VTC, participants substantially outperform VTC given longer viewing times (i.e., >200 ms). Next, we leverage a stimulus set that enables us to characterize PRC involvement in these temporally extended visual inferences. We find that neurotypical human performance 'at a glance' resembles the deficits observed in PRC-lesioned human participants. By measuring gaze behaviors during these temporally extended viewing periods, we find that participants sequentially sample task-relevant features via multiple saccades and fixations. These patterns of visuospatial attention are both reliable across participants and necessary for PRC-dependent visual inferences. These data reveal complementary neural systems that support visual object perception: VTC provides a rich set of visual features 'at a glance,' while PRC integrates over the sequential outputs of VTC to support object-level inferences.