Cross-task, explainable, and real-time decoding of human emotion states by integrating grey and white matter intracranial neural activity
Abstract
Decoding human emotion states from neural activity has significant applications in human-computer interfaces, psychiatric diagnostics, and neuromodulation therapies. Intracranial electroencephalography (iEEG) is a promising modality for decoding because it balances temporal resolution, spatial resolution, and noise robustness. However, real-world decoding requires high performance that integrates neural activity from both grey and white matter, stable generalization across contexts, sufficient explainability of the neural encoding, and robust real-time implementation, all of which remain elusive. Here, we simultaneously recorded iEEG and abundant self-rated valence and arousal scores, measuring the two primitive dimensions of emotion, across two emotion-eliciting tasks in eighteen epilepsy subjects. We developed self-supervised deep learning models that achieved high-performance decoding by integrating grey and white matter signals and generalized across tasks. The models provided strong explainability by revealing shared and preferred mesolimbic-thalamo-cortical subnetworks encoding valence and arousal, as well as the structural connectivity basis underlying grey and white matter integration. Finally, the models were implemented online and realized robust real-time decoding in four new subjects. Our results have implications for advancing emotion-decoding neurotechnology and for understanding the mechanisms of emotion encoding.