Brain dynamics of mental state attribution during perception of social robot faces



Abstract

The interplay of mind attribution and emotional responses is considered crucial in shaping human trust and acceptance of social robots. Understanding this interplay can help us create the right conditions for successful human-robot social interaction in the service of societal needs. In this study, we show that information describing positive, negative, or neutral robot behavior prompts participants (N = 90) to attribute mental states to robot faces, modulating impressions of trustworthiness, facial expression, and intentionality. These novel findings were replicated in an experiment investigating the underlying dynamics in the human mind and brain. EEG recordings from 30 participants revealed that affective information influenced specific processing stages in the brain associated with basic face perception and more elaborate stimulus evaluation. However, a modulation of fast emotional brain responses, typically found for human faces, was not observed. These findings suggest that neural processing of robot faces alternates between treating them as mindless machines and as intentional agents: people rapidly attribute mental states during perception, literally seeing good or bad intentions in robot faces, but are emotionally less affected than when facing humans. These nuanced insights into the fundamental psychological and neurocognitive processes supporting mind attribution hold potential for informing the design of artificial social agents, improving human-robot social interactions, and guiding policies regarding moral responsibility.