Artificial emotional introspection improves learning for facial emotion recognition

Abstract

While facial emotion recognition (FER) systems have advanced significantly, they remain constrained by conventional training paradigms that rely solely on low-level optimization signals, with no mechanism for the model to reflect on or adaptively respond to its own learning experience. Here, we integrate the I-Center, a computational framework for artificial introspection, into the training pipeline of FER models. The I-Center translates real-time training metrics such as loss, gradient flow, inference time, and prediction confidence into an emotional feature vector grounded in the psychological valence–arousal model of emotion, providing the network with a continuous, psychologically defined introspective account of its own operational state. Across several architectures, including standard convolutional neural networks and an attention-augmented ResNet, emotionally enhanced models significantly outperformed their baseline counterparts on the FER-2013 dataset. These results show that psychologically grounded, emotionally informed introspection can enhance learning and generalization in FER, moving beyond AI that merely recognizes emotion toward systems that learn with an integrated awareness of their own cognitive-emotional state. This work bridges machine introspection with affective computing, offering a pathway toward more transparent, adaptive, and psychologically plausible human-AI interaction.
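
The abstract states that the I-Center maps per-step training metrics (loss, gradient flow, inference time, prediction confidence) onto a valence–arousal feature vector, but does not specify the mapping here. The Python sketch below illustrates one plausible form such a translation could take; the function names, weightings, and scales are illustrative assumptions, not the authors' implementation.

# Minimal sketch (assumed, not the published method): mapping per-step
# training metrics to a valence-arousal "emotional feature vector" that
# could be fed back to the model as an introspective signal.
import math
from dataclasses import dataclass

@dataclass
class TrainingMetrics:
    loss: float               # current batch loss
    grad_norm: float          # global gradient norm (proxy for "gradient flow")
    inference_time_ms: float  # forward-pass latency in milliseconds
    confidence: float         # mean max softmax probability in [0, 1]

def metrics_to_valence_arousal(m: TrainingMetrics,
                               loss_scale: float = 2.0,
                               grad_scale: float = 5.0,
                               time_scale_ms: float = 50.0) -> tuple:
    """Map raw metrics to (valence, arousal), each in [-1, 1].

    Assumed convention: low loss and high confidence read as "pleasant"
    (positive valence); large gradients and slow inference read as
    "activating" (high arousal).
    """
    # Valence: high confidence and low loss push toward positive values.
    loss_term = math.tanh(m.loss / loss_scale)          # ~0 (good) .. ~1 (bad)
    valence = (m.confidence - 0.5) * 2.0 * (1.0 - loss_term) - loss_term

    # Arousal: large gradient norms and slow inference raise activation.
    grad_term = math.tanh(m.grad_norm / grad_scale)
    time_term = math.tanh(m.inference_time_ms / time_scale_ms)
    arousal = 2.0 * (0.7 * grad_term + 0.3 * time_term) - 1.0

    clamp = lambda x: max(-1.0, min(1.0, x))
    return clamp(valence), clamp(arousal)

# Example: a confident, low-loss step reads as pleasant and calm.
print(metrics_to_valence_arousal(
    TrainingMetrics(loss=0.3, grad_norm=1.2, inference_time_ms=8.0, confidence=0.9)))

In a setup like this, the resulting two-dimensional vector could be concatenated with intermediate features or used to modulate training (e.g., learning-rate or attention scaling); the paper's specific integration point is not described in this excerpt.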
