Neural computations of visual, semantic, and memorability features in the human brain

Abstract

Object recognition requires integrated processing that extends beyond the visual cortex, incorporating semantic and memory-related processes. However, it remains unclear how different attributes, such as visual, semantic, and memorability features, are encoded and interact during perception. Here, we recorded intracranial electroencephalography from 5143 channels while participants viewed natural object images. We systematically characterized the spatiotemporal patterns of neural encoding for visual, semantic, and memorability attributes and showed that memorability was encoded in a distributed manner that could be dissociated from visual and semantic coding. While the ventral temporal cortex (VTC) was engaged in encoding all three attributes, their representations were dissociable. Interestingly, memorability representations in the prefrontal cortex appeared to arise from integrated visual and semantic signals from the VTC, and memorability influenced early stages of visual and semantic processing. Our results were corroborated by high-resolution 7T fMRI, which revealed continuous encoding across the brain, and further validated using a separate dataset featuring within-category object variability. Lastly, single-neuron recordings confirmed semantic and memorability coding in the medial temporal lobe. Together, these findings provide a comprehensive view of how visual, semantic, and memorability attributes are dynamically encoded across the brain, highlighting the complex interplay among attributes that collectively shapes object recognition and memory formation.