A Common Representational Code for Event and Object Concepts in the Brain

Abstract

Events and objects are two fundamental ways in which humans conceptualize their experience of the world. Despite the significance of this distinction for human cognition, it remains unclear whether the neural representations of object and event concepts are categorically distinct or, instead, can be explained in terms of a shared representational code. We investigated this question by analyzing fMRI data acquired from human participants (males and females) while they rated their familiarity with the meanings of individual words (all nouns) denoting object and event concepts. Multivoxel pattern analyses indicated that both categories of lexical concepts are represented in overlapping fashion throughout the association cortex, even in the areas that showed the strongest selectivity for one or the other type in univariate contrasts. Crucially, in these areas, a feature-based model trained on neural responses to individual event concepts successfully decoded object concepts from their corresponding activation patterns (and vice versa), showing that these two categories share a common representational code. This code was effectively modeled by a set of experiential feature ratings, which also accounted for the mean activation differences between these two categories. These results indicate that neuroanatomical dissociations between events and objects emerge from quantitative differences in the cortical distribution of more fundamental features of experience. Characterizing this representational code is an important step in the development of theory-driven brain-computer interface technologies capable of decoding conceptual content directly from brain activity.
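The cross-decoding logic described above can be sketched in a few lines: fit a linear encoding model that maps experiential feature ratings to voxel activation patterns for one concept category, then identify held-out concepts from the other category by matching their observed patterns to model-predicted ones. The sketch below uses synthetic data and a plain ridge solution; the feature dimensions, concept counts, and noise level are illustrative assumptions, not the authors' actual pipeline or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

n_features, n_voxels = 8, 200      # hypothetical experiential dimensions, voxels
n_train, n_test = 40, 10           # hypothetical "event" / "object" concept counts

# Hypothetical experiential feature ratings (e.g., vision, audition, emotion)
F_train = rng.normal(size=(n_train, n_features))   # training ("event") concepts
F_test = rng.normal(size=(n_test, n_features))     # held-out ("object") concepts

# Simulated shared code: each voxel responds as a linear mix of features + noise
W_true = rng.normal(size=(n_features, n_voxels))
Y_train = F_train @ W_true + 0.1 * rng.normal(size=(n_train, n_voxels))
Y_test = F_test @ W_true + 0.1 * rng.normal(size=(n_test, n_voxels))

# Fit a ridge-regularized encoding model (features -> voxel patterns) on one category
lam = 1.0
W_hat = np.linalg.solve(F_train.T @ F_train + lam * np.eye(n_features),
                        F_train.T @ Y_train)

# Cross-category decoding: predict patterns for the other category's concepts,
# then match each observed pattern to the most correlated predicted pattern
Y_pred = F_test @ W_hat
corr = np.corrcoef(Y_test, Y_pred)[:n_test, n_test:]
decoded = corr.argmax(axis=1)
accuracy = (decoded == np.arange(n_test)).mean()
print(f"cross-decoding accuracy: {accuracy:.2f}")
```

Above-chance accuracy on held-out concepts is what indicates a shared, feature-based representational code across the two categories.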

Significance Statement

We investigated how word meaning is encoded in the brain by examining the neural representations of individual lexical concepts from two distinct categories—objects and events. We found that both kinds of concepts were encoded in neural activity patterns in terms of a shared representational space characterized by different modalities of perceptual and emotional experience. This indicates that individual concepts from a wide variety of semantic categories can, at least in principle, be decoded from neural activity using a generative model of concept representation based on interpretable semantic features. Furthermore, both object and event concepts could be decoded from cortical regions previously hypothesized to encode category-specific representations, suggesting that the two categories are jointly represented in these areas.
