Object representations drive emotion schemas across a large and diverse set of daily-life scenes
Abstract
The rapid emotional evaluation of objects and events is essential in daily life. While visual scenes reliably evoke emotions, it remains unclear whether emotion schemas evoked by daily-life scenes depend on object-processing systems or are extracted independently. To explore this, we collected emotion ratings for 4,913 daily-life scenes from 300 participants and predicted these ratings from representations in deep neural networks and fMRI activity patterns in visual cortex. AlexNet, an object-based model, outperformed EmoNet, an emotion-based model, in predicting emotion ratings for everyday scenes, while EmoNet excelled for explicitly evocative stimuli. Emotion information was processed hierarchically within the object recognition system, consistent with the visual cortex's organization. Activity patterns in the lateral occipital complex (LOC), an object-selective region, reliably predicted emotion ratings and outperformed other visual regions. These findings suggest that emotion processing in everyday scenes follows visual object recognition, with additional mechanisms engaged when object content is uninformative.
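To make the prediction setup described above concrete, here is a minimal sketch, not the authors' pipeline: it assumes AlexNet fc7 activations as the object-based representation and cross-validated ridge regression as the encoding model. The image paths, rating vector, cross-validation scheme, and alpha grid are all illustrative placeholders.

```python
# Sketch: predicting scene emotion ratings from object-based DNN features.
# Assumptions (not from the paper): fc7 features, RidgeCV, 10-fold CV.
import numpy as np
import torch
from torchvision import models, transforms
from PIL import Image
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_predict
from scipy.stats import pearsonr

alexnet = models.alexnet(weights=models.AlexNet_Weights.DEFAULT).eval()
preprocess = transforms.Compose([
    transforms.Resize(256), transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def fc7_features(path):
    """Return the 4096-d penultimate-layer (fc7) activation for one image."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    h = torch.flatten(alexnet.avgpool(alexnet.features(x)), 1)
    return alexnet.classifier[:-1](h).squeeze(0).numpy()

# Hypothetical inputs: scene images and one mean emotion rating per scene.
image_paths = [...]                # e.g. paths to 4,913 daily-life scenes
ratings = np.array([...])          # mean rating per scene, same order

X = np.stack([fc7_features(p) for p in image_paths])
encoder = RidgeCV(alphas=np.logspace(-2, 4, 13))
predicted = cross_val_predict(encoder, X, ratings, cv=10)
r, p = pearsonr(predicted, ratings)
print(f"cross-validated prediction accuracy: r = {r:.3f} (p = {p:.3g})")
```

The same scaffold would extend to the model comparison the abstract reports: swap the feature extractor (e.g., an EmoNet-style emotion model, or earlier AlexNet layers to probe the hierarchy) and compare cross-validated correlations.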