The representation of facial emotion expands from sensory to prefrontal cortex with development

Abstract

Facial expression recognition develops rapidly during infancy and continues to improve from childhood to adulthood. As a critical component of social communication, this skill enables individuals to interpret others’ emotions and intentions. However, the brain mechanisms driving its development remain largely unclear, owing to the difficulty of obtaining data with both high spatial and high temporal resolution from young children. By analyzing intracranial EEG data collected from a childhood group (5–10 years old) and a post-childhood group (13–55 years old), we find differential involvement of high-level brain areas in processing facial expression information. In the post-childhood group, both the posterior superior temporal cortex (pSTC) and the dorsolateral prefrontal cortex (DLPFC) encode facial emotion features drawn from a high-dimensional, continuous space. In children, by contrast, facial expression information is significantly represented only in the pSTC, not in the DLPFC. Further, the encoding of complex emotions in the pSTC increases with age. Taken together, these data suggest that young children rely more on low-level sensory areas than on the prefrontal cortex for facial emotion processing. We therefore hypothesize that top-down modulation from the prefrontal cortex to the pSTC gradually matures during development, eventually enabling a full understanding of facial emotions, especially complex emotions, which require social and life experience to comprehend.
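
The key analysis concept here is an encoding model: trial-wise facial-emotion feature vectors are used to predict a neural signal, and a region is said to "encode" those features when cross-validated prediction exceeds a chance-level null. The sketch below illustrates that general logic only; it is not the authors' pipeline, and the synthetic data, feature dimensionality, ridge regularization, and permutation test are all illustrative assumptions.

```python
# Minimal sketch (NOT the authors' method): a generic encoding-model test of
# whether an electrode's response can be predicted from emotion features.
# All data are synthetic; sizes and parameters below are assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

n_trials, n_features = 200, 10                      # assumed sizes
X = rng.normal(size=(n_trials, n_features))         # emotion features per face
true_w = rng.normal(size=n_features)                # hypothetical ground truth
y = X @ true_w + rng.normal(scale=2.0, size=n_trials)  # electrode response

def encoding_score(X, y, alpha=1.0, cv=5):
    """Cross-validated correlation between predicted and observed responses."""
    pred = cross_val_predict(Ridge(alpha=alpha), X, y, cv=cv)
    return np.corrcoef(pred, y)[0, 1]

observed = encoding_score(X, y)

# Permutation test: shuffling trial labels breaks the feature-response
# mapping, giving a null distribution for the encoding score.
null = np.array([encoding_score(X, rng.permutation(y)) for _ in range(200)])
p_value = (np.sum(null >= observed) + 1) / (len(null) + 1)

print(f"encoding score r = {observed:.3f}, permutation p = {p_value:.3f}")
```

Under this logic, a region whose observed score clears the permutation null (as the pSTC does in both age groups, and the DLPFC only post-childhood) is counted as significantly representing the facial-emotion feature space.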
