The EXPERIENCE system: integrating virtual reality and multi-modal explainable artificial intelligence for enhanced depression recognition

Abstract

Depression is a complex and heterogeneous psychiatric condition, characterized by a broad spectrum of symptoms that vary significantly among individuals. Traditional screening and diagnostic approaches, which primarily rely on self-reported questionnaires and clinical interviews, often fail to fully capture this complexity, resulting in inconsistent diagnoses and variable treatment outcomes. This study investigates the potential of integrating Virtual Reality (VR) and Explainable Artificial Intelligence (XAI) to improve the accuracy and reliability of depression screening. A VR-based serious game was developed to capture a range of behavioral, cognitive, meta-cognitive, and physiological indicators and to evaluate its effectiveness in identifying individuals with depressive symptoms. The study sample consisted of 100 participants, including 50 individuals with depressive symptoms (DS) and 50 healthy controls (HC), classified using the Patient Health Questionnaire-9 (PHQ-9). Multi-modal data collected within the VR environment were analyzed using the CatBoost machine learning algorithm to classify participants as DS or HC. Results demonstrated a mean classification accuracy of 72%, with the best performance reaching 80%. Feature importance analysis indicated that the model's decision-making process relied on a combination of physiological, behavioral, and meta-cognitive measures. These findings underscore the value of incorporating multi-modal data into depression assessments and suggest that integrating VR and XAI could pave the way for more effective and personalized approaches to screening, diagnosis, and treatment of depression.
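As a rough illustration of the kind of analysis the abstract describes, the minimal sketch below shows how a CatBoost classifier might be trained on multi-modal features and then queried for feature importances to support explainability. The feature names, synthetic data, and hyperparameters are hypothetical and are not the authors' actual pipeline.

```python
# Minimal sketch (assumptions only): CatBoost classification of DS vs. HC
# from multi-modal features, followed by feature-importance inspection.
import numpy as np
from catboost import CatBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Illustrative placeholder features (physiological, behavioral, meta-cognitive)
rng = np.random.default_rng(0)
feature_names = ["heart_rate_mean", "task_completion_time", "confidence_rating"]
X = rng.normal(size=(100, len(feature_names)))   # 100 synthetic participants
y = rng.integers(0, 2, size=100)                 # 0 = HC, 1 = DS (synthetic labels)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Train a gradient-boosted tree classifier
model = CatBoostClassifier(iterations=200, depth=4, verbose=False)
model.fit(X_train, y_train)

# Evaluate classification accuracy on the held-out split
pred = model.predict(X_test)
print("Accuracy:", accuracy_score(y_test, pred))

# Feature importance scores underpin the explainability (XAI) step
for name, score in zip(feature_names, model.get_feature_importance()):
    print(f"{name}: {score:.2f}")
```

In practice, such importance scores would be computed on the real multi-modal measures collected in the VR environment rather than on synthetic data.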
