The Contextual Affects of Facial Expression


Abstract

Affect recognition and communication are critical for everyday social interaction. Traditional emotion research has often assumed that facial expressions reliably reflect internal emotional states and conform to a set of universal expressions. However, emerging evidence suggests that facial expressions are highly individualized and influenced by context. To investigate these factors, we used genetic algorithms to enable participants to generate personalized facial expressions that best matched their subjective interpretation of affect within written contextual scenarios. Twelve participants read standardized scenarios drawn from Howard Schatz’s Actors Acting, which had been independently rated (0–4) across 13 primary emotions (Amusement, Anger, Awe, Contempt, Disgust, Embarrassment, Fear, Happiness, Interest, Pride, Sadness, Shame, Surprise). Each primary emotion was paired with all 12 secondary emotions, which were either maximized (Condition 1) or minimized (Condition 2). Participants completed 26 such randomly interleaved trials, followed by a final condition (Condition 3) in which they selected faces corresponding directly to each of the 13 emotion words. Faces were generated within the 199-dimensional coefficient space of the Basel Face Database, with a genetic algorithm presenting 12 unique faces across 6 generations per scenario. Faces selected by participants as matching a given scenario were semi-randomly combined to generate 6 offspring per generation, with the remaining 6 faces generated randomly to maintain genetic diversity. A 4-way ANOVA (factors: affect, context, generation, participant) on the cosine distances among faces revealed a significant main effect of affect (p = .0061) and a significant interaction between affect and context (p = .0003). These results indicate that distinct facial structures are associated with different emotions, but the representation of a given emotion is context-dependent.
This work offers a novel approach to visualizing individual differences in emotion perception and has implications for advancing personalized tools in affective science, including clinical assessment and emotion recognition systems.
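The generation loop described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the coefficient range, the uniform-crossover reading of "semi-randomly combined", and the participant-selection stand-in are all assumptions; only the population size (12), offspring count (6), random-face count (6), generation count (6), and the 199-dimensional coefficient space come from the abstract.

```python
import random

DIM = 199          # Basel Face Database coefficient space (from the abstract)
POP_SIZE = 12      # unique faces shown per generation
N_OFFSPRING = 6    # faces bred from participant-selected parents
N_RANDOM = 6       # fresh random faces injected to maintain genetic diversity
N_GENERATIONS = 6  # generations per scenario

def random_face():
    # Hypothetical coefficient range; the true scale depends on the database.
    return [random.uniform(-1.0, 1.0) for _ in range(DIM)]

def breed(parents):
    # One plausible reading of "semi-randomly combined": uniform crossover,
    # where each coefficient is inherited from a randomly chosen parent.
    return [random.choice(parents)[i] for i in range(DIM)]

def next_generation(population, selected_idx):
    parents = [population[i] for i in selected_idx]
    if not parents:  # nothing selected: fall back to a fully random generation
        return [random_face() for _ in range(POP_SIZE)]
    offspring = [breed(parents) for _ in range(N_OFFSPRING)]
    fresh = [random_face() for _ in range(N_RANDOM)]
    return offspring + fresh

pop = [random_face() for _ in range(POP_SIZE)]
for _ in range(N_GENERATIONS):
    chosen = [0, 3, 7]  # stand-in for the faces a participant clicked
    pop = next_generation(pop, chosen)
```

Each generation thus preserves participant preferences through the 6 bred offspring while the 6 random faces keep the search from collapsing onto a single region of the coefficient space.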
