Temporal dynamics of short-term neural adaptation across human visual cortex
Abstract
Neural responses in visual cortex adapt to prolonged and repeated stimuli. While adaptation occurs across the visual cortex, it is unclear how adaptation patterns and computational mechanisms differ across the visual hierarchy. Here we characterize two signatures of short-term neural adaptation in time-varying intracranial electroencephalography (iEEG) data collected while participants viewed naturalistic image categories varying in duration and repetition interval. Ventral and lateral occipitotemporal cortex exhibit slower, more prolonged adaptation to single stimuli and slower recovery from adaptation to repeated stimuli compared to V1-V3. For category-selective electrodes, recovery from adaptation is slower for preferred than non-preferred stimuli. To model neural adaptation, we augment our delayed divisive normalization (DN) model by scaling the input strength as a function of stimulus category, enabling the model to accurately predict neural responses across multiple image categories. The model fits suggest that differences in adaptation patterns arise from slower normalization dynamics in higher visual areas interacting with differences in input strength resulting from category selectivity. Our results reveal systematic differences in temporal adaptation of neural population responses across the human visual hierarchy and show that a single computational model of history-dependent normalization dynamics, fit with area-specific parameters, accounts for these differences.
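As a rough illustration of the model class referred to here, a delayed divisive normalization model with a category-dependent input gain can be written in the general form below. The notation (stimulus s, filters h_1 and h_2, gain g_c, exponent n, semisaturation constant sigma) is an assumption for exposition and is not taken verbatim from the abstract; the exact parameterization used in the study may differ.

```latex
% Sketch of a delayed divisive normalization (DN) model with a
% category-dependent input gain g_c (assumed notation).
% s(t): stimulus time course; h_1: excitatory impulse response filter;
% h_2: low-pass (delayed) normalization filter; n, \sigma: nonlinearity parameters.
\begin{align}
  L(t) &= g_c \,\bigl(s * h_1\bigr)(t)
         && \text{category-scaled input drive} \\
  r(t) &= \frac{L(t)^{\,n}}{\sigma^{\,n} + \bigl(L * h_2\bigr)(t)^{\,n}}
         && \text{delayed divisive normalization}
\end{align}
```

In this form, slower normalization dynamics (a longer time constant in h_2) prolong adaptation and delay recovery, while a larger category gain g_c increases the input drive for preferred stimuli, which is the kind of interaction invoked above to explain area- and category-dependent adaptation.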
Author summary
Neural responses in visual cortex adapt over time, with reduced responses to prolonged and repeated stimuli. Here, we examine how adaptation patterns differ across the visual hierarchy in neural responses recorded from human visual cortex with high temporal and spatial precision. To identify possible neural computations underlying short-term adaptation, we fit the response time courses using a temporal divisive normalization model. The model accurately predicts responses to prolonged and repeated stimuli in lower and higher visual areas, and reveals differences in temporal adaptation across the visual hierarchy and stimulus categories. Our model suggests that differences in adaptation patterns result from differences in divisive normalization dynamics. Our findings shed light on how information is integrated in the brain on a millisecond timescale and offer an intuitive framework to study the emergence of neural dynamics across brain areas and stimuli.
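The short numerical sketch below shows how a model of this general form reproduces the two adaptation signatures discussed above: a response that decays during a prolonged stimulus and a suppressed response to a repeated stimulus that recovers with longer intervals. The exponential filter shapes, parameter values, and function names are illustrative assumptions, not fitted values or code from this study.

```python
import numpy as np

def dn_response(stim, dt=0.001, tau1=0.05, tau2=0.1, sigma=0.1, n=2.0, gain=1.0):
    """Simulate a delayed divisive normalization (DN) response to a stimulus
    time course. Filter shapes and parameter values are illustrative only."""
    t = np.arange(len(stim)) * dt
    h1 = np.exp(-t / tau1); h1 /= h1.sum()              # excitatory impulse response
    h2 = np.exp(-t / tau2); h2 /= h2.sum()              # delayed (low-pass) normalization filter
    drive = gain * np.convolve(stim, h1)[:len(stim)]    # (category-scaled) input drive
    norm = np.convolve(drive, h2)[:len(stim)]           # delayed normalization pool
    return drive**n / (sigma**n + norm**n)

# Prolonged vs. repeated stimulus time courses at 1 ms resolution
dt = 0.001
time = np.arange(0, 1.0, dt)
prolonged = ((time > 0.1) & (time < 0.6)).astype(float)
repeated = (((time > 0.1) & (time < 0.2)) | ((time > 0.4) & (time < 0.5))).astype(float)

r_long = dn_response(prolonged, dt=dt)
r_rep = dn_response(repeated, dt=dt)

# The prolonged response decays after an initial transient, and the second
# response to the repeated stimulus is suppressed relative to the first.
print("prolonged peak:", r_long.max())
print("first vs. second peak:", r_rep[:300].max(), r_rep[300:].max())
```

Increasing the normalization time constant (tau2 in this sketch) slows the decay during prolonged stimulation and the recovery between repetitions, mimicking the slower adaptation dynamics reported for higher visual areas.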