Fine-Grained Sentiment Mining at Document Level on Big Data Using a State-of-the-Art Representation-Based Transformer: ModernBERT
Abstract
As our active and passive digital footprints continuously aggregate into Big Data, the field of Artificial Intelligence (AI) has, in parallel, provided methodologies and tools for exploiting it. Given the prevalence of Big Data, Sentiment Mining (SM), Opinion Mining (OM), and Aspect-Based Sentiment Analysis (ABSA) have become increasingly relevant topics within the subfield of Social Network Analysis (SNA). "Fine-grained" Sentiment Mining or Opinion Mining focuses on determining the deeper intensities of users' emotions or viewpoints with respect to a given topic. Our work examines and exploits Big Data collections with the goal of extracting fine-grained sentiment on a given topic using a state-of-the-art representation-based transformer architecture. Much of the existing literature has exploited structured data for Sentiment Mining using Recurrent Neural Network (RNN) and Recursive Neural Network (RvNN) architectures; however, largely owing to computational constraints, only a few studies have exploited Big Data for Sentiment Mining using transformer-based architectures. Our research contributes to this latter body of work and fills the gap by employing a dedicated, high-end, enterprise-grade data-center Graphics Processing Unit (GPU) to overcome the computational constraints commonly associated with harnessing Big Data and post-training transformer architectures. Our proposed framework leverages the fundamental encoder-only transformer architecture of a pre-trained Modern Bidirectional Encoder Representations from Transformers (ModernBERT) model, and remains effective in the presence of noisy data.
The results of our experiments reported herein are highly promising with respect to the objective functions employed in our research.
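The abstract does not detail the classification head used on top of the encoder. As a minimal illustrative sketch only (the hidden size, the five-class fine-grained label scheme, and all variable names here are assumptions, not the authors' implementation), a fine-grained sentiment head over an encoder-only model's pooled document representation might look like:

```python
import numpy as np

# Assumed dimensions for this sketch: a ModernBERT-style encoder with a
# 768-dim hidden state, and "fine-grained" sentiment illustrated as five
# ordinal classes (very negative .. very positive).
HIDDEN_SIZE = 768
NUM_CLASSES = 5

rng = np.random.default_rng(0)
# Stand-in for the encoder's pooled [CLS] output for a batch of 4 documents;
# in practice this would come from the pre-trained ModernBERT encoder.
cls_repr = rng.standard_normal((4, HIDDEN_SIZE))

# A linear classification head: logits = h @ W + b
W = rng.standard_normal((HIDDEN_SIZE, NUM_CLASSES)) * 0.02
b = np.zeros(NUM_CLASSES)
logits = cls_repr @ W + b

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

probs = softmax(logits)              # per-document distribution over the 5 classes
pred = probs.argmax(axis=-1)         # predicted fine-grained sentiment class
```

In a real fine-tuning setup, `W` and `b` would be trained jointly with (or on top of) the pre-trained encoder rather than randomly initialized and left fixed.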