Quantifying Dynamic Facial Motion Using Parametrically Controlled Photorealistic Avatars
Abstract
Facial expressions are central to human communication and can be systematically described through the Facial Action Coding System (FACS), which decomposes expressions into discrete facial movements, or Action Units (AUs). Open-source automated AU detection systems such as AFAR and OpenFace are now widely used in psychological research due to their efficiency and accessibility, yet their performance and potential biases remain insufficiently characterized. Here, we leverage photorealistic, parameterized MetaHuman animations to create a highly controlled yet naturalistic stimulus set in which facial motion is precisely specified over time and held constant across identities. Four avatars differing in sex, age, and ethnicity were animated with identical motion parameters to generate a set of single-AU and AU-combination sequences, enabling systematic evaluation of model performance under tightly controlled conditions. Using AFAR and OpenFace, we assessed AU detection accuracy and temporal dynamics through classification metrics and cross-correlation analyses. Although both systems performed above chance, their accuracy varied substantially across AU types and avatars. In particular, single-AU movements were detected less reliably than AU combinations, and both systems showed pronounced biases toward smiling-related AUs. To address these limitations, we introduce a landmark-displacement analysis that directly quantifies non-rigid facial motion from tracked geometry, yielding consistent and accurate motion estimates across avatars and animations. Together, our findings reveal systematic biases in current AU detection systems and show how parametrically controlled avatar stimuli, combined with motion-based analyses, provide a powerful framework for probing dynamic face perception.
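To illustrate the kind of temporal-dynamics comparison the abstract describes, the sketch below cross-correlates a known AU activation curve (the animation parameter) with a detected AU intensity time series and reports the peak correlation and its lag. The function name, array layout, and synthetic data are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch: temporal alignment between a ground-truth AU curve and a
# detected AU intensity series via normalized cross-correlation.
# All names and data here are hypothetical, for illustration only.
import numpy as np
from scipy import signal

def peak_crosscorr(ground_truth: np.ndarray, detected: np.ndarray):
    """Return the peak normalized cross-correlation and its lag (in frames)."""
    # Z-score both series so the correlation is scale-invariant.
    gt = (ground_truth - ground_truth.mean()) / ground_truth.std()
    det = (detected - detected.mean()) / detected.std()
    corr = signal.correlate(det, gt, mode="full") / len(gt)
    lags = signal.correlation_lags(len(det), len(gt), mode="full")
    peak = int(np.argmax(corr))
    return corr[peak], int(lags[peak])

# Synthetic example: the "detected" signal lags ground truth by 3 frames.
t = np.linspace(0, 2 * np.pi, 120)
gt_curve = np.clip(np.sin(t), 0, None)  # parametric AU ramp up and down
detected_curve = np.roll(gt_curve, 3) + 0.05 * np.random.default_rng(0).normal(size=t.size)
r, lag = peak_crosscorr(gt_curve, detected_curve)
print(f"peak r = {r:.2f} at lag = {lag} frames")
```

A peak near lag zero with high r indicates that a detector tracks the animated AU faithfully in time; a shifted or flattened peak indicates latency or missed motion.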
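The landmark-displacement idea can likewise be sketched: per-frame landmarks are Procrustes-aligned to a reference frame to discount rigid head motion (translation, scale, rotation), and the residual displacement quantifies non-rigid facial movement. The array shapes and the use of SciPy's orthogonal Procrustes solver are assumptions for illustration; the paper's exact procedure may differ.

```python
# Sketch of a landmark-displacement measure of non-rigid facial motion.
# Assumes landmarks with shape (frames, points, 2); hypothetical layout.
import numpy as np
from scipy.linalg import orthogonal_procrustes

def nonrigid_displacement(landmarks: np.ndarray) -> np.ndarray:
    """Per-frame mean landmark displacement from a reference frame,
    after removing translation, scale, and rotation."""
    ref = landmarks[0] - landmarks[0].mean(axis=0)  # center the reference
    ref /= np.linalg.norm(ref)                      # normalize scale
    disp = np.zeros(len(landmarks))
    for i, frame in enumerate(landmarks):
        cur = frame - frame.mean(axis=0)            # remove translation
        cur /= np.linalg.norm(cur)                  # remove scale
        R, _ = orthogonal_procrustes(cur, ref)      # best-fit rotation
        aligned = cur @ R
        # Residual motion is what rigid alignment cannot explain.
        disp[i] = np.linalg.norm(aligned - ref, axis=1).mean()
    return disp
```

Because the measure is computed directly from tracked geometry rather than from learned AU classifiers, it should in principle respond equally to any facial movement, which is the property the abstract credits for its consistency across avatars and animations.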