An Encoder-Only Transformer Model for Depression Detection from Social Network Data: The DEENT Approach

Abstract

Depression is a common mental illness whose prevalence was magnified by the COVID-19 pandemic, making early depression detection pivotal for public health systems. Various works have addressed depression detection in social network data. Nevertheless, they relied on data collected before the pandemic and did not exploit the capabilities of the Transformer architecture for binary classification on large, high-dimensional data. This paper introduces a model based on the encoder-only Transformer architecture to detect depression in a large Twitter dataset collected during the COVID-19 pandemic. To this end, we present DEENT, an approach comprising a depression-oriented dataset, built with BERT and K-means from a previous Twitter dataset labeled for sentiment analysis, and two models, DEENT-Generic and DEENT-Bert, for effectively classifying depressive and non-depressive tweets. DEENT was evaluated extensively and compared against Random Forest, Support Vector Machine, XGBoost, Recurrent and Convolutional Neural Networks, and MentalBERT. The results revealed that DEENT-Bert outperforms the baseline models in accuracy, balanced accuracy, precision, recall, and F1-score for classifying non-depressive and depressive tweets, while DEENT-Generic detects depressive tweets better than the baselines. We attribute these results to DEENT leveraging the encoder-only Transformer architecture and fine-tuning to detect depression in large Twitter datasets effectively. We therefore conclude that DEENT is a promising solution for detecting depressive and non-depressive tweets.
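The dataset-construction step summarized above (tweet embeddings clustered with K-means to derive depression-oriented labels from a sentiment-labeled corpus) can be sketched roughly as follows. This is a minimal illustration, not the paper's exact procedure: random vectors stand in for BERT [CLS] embeddings, the 768-dimensional size and the mapping of clusters to depressive/non-depressive labels are assumptions made for the example.

```python
# Hypothetical sketch of the dataset-labeling step: embed tweets, then
# partition the embeddings into two clusters that can be mapped to
# depressive / non-depressive pseudo-labels.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Placeholder for 768-dimensional BERT embeddings of 100 tweets.
# (Assumption: the paper uses real BERT outputs; random vectors are
# used here only so the sketch is self-contained.)
embeddings = rng.normal(size=(100, 768))

# Two clusters, to be mapped to the two target classes.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(embeddings)
labels = kmeans.labels_  # one pseudo-label (0 or 1) per tweet
print(labels.shape)
```

In a real pipeline, the resulting pseudo-labels would then serve as supervision for fine-tuning the encoder-only classifier on the tweet texts themselves.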
