Concurrent contextual and time-distant mnemonic information co-exist as feedback in the human visual cortex
Javier Ortiz-Tudela, Johanna Bergmann, Matthew Bennett, Isabelle Ehrlich, Lars Muckli, Yee Lee Shing
Efficient processing of the visual environment necessitates the integration of incoming sensory evidence with concurrent contextual inputs and mnemonic content from our past experiences. To examine how this integration takes place in the brain, we isolated different types of feedback signals from the neural patterns of non-stimulated areas of the early visual cortex in humans (i.e., V1 and V2). Using multivariate pattern analysis, we showed that both contextual and time-distant information coexist in V1 and V2 as feedback signals. In addition, we found that the extent to which mnemonic information is reinstated in V1 and V2 depends on whether the information is retrieved episodically or semantically. Critically, this reinstatement was independent of the retrieval route in the object-selective cortex. These results demonstrate that our early visual processing contains not just direct and indirect information from the visual surroundings, but also memory-based predictions.
NeuroImage. 265:119778 (2023)