Efficient processing of the visual environment necessitates the integration of incoming sensory evidence with concurrent contextual inputs and mnemonic content from our past experiences. To delineate how this integration takes place in the brain, we studied modulations of feedback neural patterns in non-stimulated areas of the early visual cortex in humans (i.e., V1 and V2). Using functional magnetic resonance imaging and multivariate pattern analysis, we show that both concurrent contextual and time-distant mnemonic information coexist in V1/V2 as feedback signals. The extent to which mnemonic information is reinstated in V1/V2 depends on whether the information is retrieved episodically or semantically. These results demonstrate that our stream of visual experience contains not just information from our visual surroundings, but also memory-based predictions generated internally in the brain.
Feedback activity in human early visual cortex contains concurrent contextual and time-distant mnemonic information.