Neural Representations of Perceived Engagement during Action Observation
Abstract
Interpersonal motor interactions are central to social life, yet it remains unclear how social cues relevant to detecting engagement are encoded in the brain. Recent evidence suggests that regions traditionally associated with mentalizing, such as the dorsomedial prefrontal cortex (dmPFC) and temporo-parietal junction (TPJ), co-activate with nodes of the Action Observation Network (AON) during motor engagement with others, pointing to a synergistic role in the processing of action features during interaction. Using fMRI and Representational Similarity Analysis (RSA), we examined brain responses to reach-to-grasp actions varying in Goal (passing vs. placing), Perspective (2nd vs. 3rd person), and Gaze visibility, creating a gradient of perceived engagement. Our results show that the TPJ and premotor cortices, but not the dmPFC, display convergent neural geometry during action observation. Model- and cluster-based analyses further linked TPJ and AON regions in multiple representational spaces: left-hemisphere regions aligned with goal encoding, whereas right-hemisphere regions aligned with an Engagement model capturing first-person interaction. Critically, we demonstrate that the rTPJ and left premotor cortex share representational geometry for action perspective, directly coupling mentalizing and sensorimotor systems in the encoding of directional cues. By contrast, the dmPFC appeared representationally isolated from both TPJ and AON regions, with response patterns showing no similarity to the Goal or Engagement models. This pattern is consistent with accounts that the dmPFC is recruited under richer interactive demands rather than being selective for abstract action features. Together, these findings support a distributed, sensorimotor account of engagement encoding and reveal new functional links among key social cognition areas.
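As a rough sketch of the RSA logic summarized above (not the authors' actual pipeline), the snippet below builds a neural representational dissimilarity matrix (RDM) from hypothetical condition-wise ROI activity patterns and rank-correlates it with binary Goal and Engagement model RDMs. All variable names, condition counts, and factor codings are illustrative assumptions, and the data are random placeholders.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

# Hypothetical setup: 8 conditions (2 Goals x 2 Perspectives x 2 Gaze levels),
# each represented by a voxel activity pattern within a single ROI.
n_conditions, n_voxels = 8, 200
rng = np.random.default_rng(0)
roi_patterns = rng.standard_normal((n_conditions, n_voxels))  # placeholder data

# Neural RDM: correlation distance between condition-wise patterns.
neural_rdm = squareform(pdist(roi_patterns, metric="correlation"))

# Hypothetical binary model RDMs: 1 = conditions differ on the factor, 0 = same.
goal = np.array([0, 0, 0, 0, 1, 1, 1, 1])         # passing vs. placing
engagement = np.array([1, 1, 0, 0, 0, 0, 0, 0])   # e.g., 2nd-person, gaze visible
goal_rdm = (goal[:, None] != goal[None, :]).astype(float)
engagement_rdm = (engagement[:, None] != engagement[None, :]).astype(float)

# Compare neural and model geometries on the upper-triangular entries (Spearman).
iu = np.triu_indices(n_conditions, k=1)
for name, model_rdm in [("Goal", goal_rdm), ("Engagement", engagement_rdm)]:
    rho, p = spearmanr(neural_rdm[iu], model_rdm[iu])
    print(f"{name} model: rho = {rho:.2f}, p = {p:.3f}")
```

In an actual analysis, the placeholder patterns would be replaced by condition-wise response estimates per ROI and participant, with model fits aggregated and tested at the group level.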