Mapping concept and relational semantic representation in the brain using large language models

Abstract

How the brain organizes semantic information is one of the most challenging and expansive questions in cognitive neuroscience. To shed light on this issue, prior studies have attempted to decode how the brain represents individual concepts. Here, we instead examined how relational information is encoded: we submitted texts to a contemporary large language model and extracted relational embeddings from the model. Using behavioral data (N = 636), we found that these embeddings capture independent information about scenes and objects, along with relational information about their semantic links. Turning to fMRI data (N = 60), we leveraged these embeddings for representational similarity analysis: The occipitotemporal cortex represents concepts in isolation, whereas the dorsolateral prefrontal cortex and basal ganglia principally encode relational information. Relational coding within prefrontal and striatal areas also tracks how participants reason about scenes and objects. Altogether, this research maps how information progresses from concept-level to integrative forms and how this progression translates into behavior.
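As a rough illustration of the representational similarity analysis described above, the sketch below correlates a representational dissimilarity matrix (RDM) built from model embeddings with one built from fMRI response patterns. This is a minimal, generic sketch of the RSA technique, not the authors' pipeline: the array shapes, variable names, and random inputs are hypothetical placeholders.

```python
# Minimal RSA sketch: compare pairwise dissimilarity structure of LLM
# embeddings with that of fMRI patterns from one region of interest.
# All inputs below are hypothetical stand-ins, not data from the study.

import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(patterns: np.ndarray) -> np.ndarray:
    """Condensed representational dissimilarity matrix.

    patterns: (n_items, n_features) array, one row per stimulus --
    e.g. an LLM embedding per text, or a voxel pattern per stimulus.
    Returns the upper-triangle pairwise dissimilarities (1 - Pearson r).
    """
    return pdist(patterns, metric="correlation")

# Hypothetical inputs: 50 stimuli, LLM embeddings vs. ROI voxel patterns.
rng = np.random.default_rng(0)
model_embeddings = rng.standard_normal((50, 768))   # e.g. LLM hidden states
roi_patterns = rng.standard_normal((50, 2000))      # e.g. voxels in one ROI

# RSA score: rank correlation between the two RDMs.
rho, p = spearmanr(rdm(model_embeddings), rdm(roi_patterns))
print(f"model-brain RSA: rho = {rho:.3f}, p = {p:.3g}")
```

In the same spirit, one RDM could be derived from concept-level embeddings and another from relational embeddings, and each correlated region by region with the neural RDMs to ask which areas track which representational structure.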
