Improved sensory representations as a result of temporal adaptation: neural and computational evidence
Abstract
Human perception is robust under challenging conditions, for example when sensory inputs change over time. Temporal adaptation in the form of reduced responses to repeated external stimuli is ubiquitously observed in the brain, yet it remains unclear how repetition suppression aids recognition of novel inputs. To clarify this, we collected behavioral and electroencephalography (EEG) measurements while human participants categorized objects embedded in visual noise patterns after first viewing these patterns in isolation, inducing adaptation to the noise stimulus. We furthermore manipulated the availability of object information in the visual input by varying the contrast of the noise-embedded objects. Our results provide convergent behavioral, neural and computational evidence of a benefit of temporal adaptation for sensory representations. Adapting to a noise pattern resulted in overall faster object recognition and better recognition of objects as object contrast increased. These adaptation-induced behavioral improvements were accompanied by more pronounced contrast-dependent modulation of object-evoked EEG responses and better decoding of object information from EEG activity. To identify potential neural computations mediating the benefits of temporal adaptation for object recognition, we equipped task-optimized deep convolutional neural networks (DCNNs) with different candidate mechanisms to adjust network activations over time. DCNNs with intrinsic adaptation mechanisms, such as additive suppression, best matched contrast-dependent human performance benefits due to adaptation, while also showing improved object decoding as a result of adaptation. Networks that use temporal divisive normalization, a biologically plausible canonical neural computation, additionally showed robustness to shifting objects, suggesting that temporal adaptation via divisive normalization aids stable representations of time-varying visual inputs. Overall, our results demonstrate how temporal adaptation improves sensory representations and identify candidate neural computations mediating these effects.
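The abstract names two candidate adaptation mechanisms, additive suppression and temporal divisive normalization. As a minimal illustrative sketch only (not the authors' implementation), the toy code below applies a commonly used exponentially decaying suppression trace to unit activations over discrete time steps, either subtracting the trace from the input drive (additive suppression) or dividing by it (divisive normalization). The function name run_adaptation and the parameters alpha, beta and sigma are assumptions chosen for illustration, not values taken from the paper.

```python
import numpy as np

def run_adaptation(drive, mechanism="additive", alpha=0.9, beta=0.7, sigma=0.1):
    """Toy recurrence for response suppression over time (illustrative only).

    drive: array of shape (T, n_units), feedforward input drive at each time step.
    alpha: decay of the suppression state (assumed value).
    beta:  strength of suppression (assumed value).
    sigma: semi-saturation constant for divisive normalization (assumed value).
    """
    T, n = drive.shape
    r = np.zeros((T, n))      # unit responses at each time step
    s = np.zeros(n)           # suppression state: trace of each unit's recent activity
    prev_r = np.zeros(n)
    for t in range(T):
        # exponentially decaying integration of the unit's own past responses
        s = alpha * s + (1 - alpha) * prev_r
        if mechanism == "additive":
            # additive suppression: subtract the trace from the input drive
            r[t] = np.maximum(drive[t] - beta * s, 0.0)
        else:
            # divisive normalization: divide the drive by the trace
            r[t] = np.maximum(drive[t], 0.0) / (sigma + beta * s)
        prev_r = r[t]
    return r

# Repeated identical input: responses decline over time (repetition suppression),
# so a novel, higher-contrast input at the final step stands out relative to the adapted baseline.
drive = np.tile(np.array([[1.0, 0.5]]), (5, 1))
drive[-1] = [2.0, 1.0]
print(run_adaptation(drive, mechanism="additive"))
print(run_adaptation(drive, mechanism="divisive"))
```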