BERTAgent: The Development of a Novel Tool to Quantify Agency in Textual Data

Abstract

Pertaining to goal-orientation and achievement, agency is a fundamental aspect of human cognition and behavior. Accordingly, detecting and quantifying the linguistic encoding of agency is critical for the analysis of human actions, interactions, and social dynamics. Available agency-quantifying computational tools rely on word-counting methods, which are typically insensitive to the semantic context in which words are used and are consequently prone to miscoding, for example in cases of polysemy. Additionally, some currently available tools do not take into account differences in the intensity and directionality of agency. To overcome these shortcomings, we present BERTAgent, a novel tool for quantifying semantic agency in text. BERTAgent is a computational language model that utilizes the transformer architecture, a popular deep learning approach to natural language processing. BERTAgent was fine-tuned on textual data that were evaluated by human coders with respect to the level of conveyed agency. In four validation studies, BERTAgent exhibits improved convergent and discriminant validity compared to previous solutions. Additionally, the detailed description of BERTAgent’s development procedure serves as a tutorial for the advancement of similar tools, providing a blueprint for leveraging existing lexicographical datasets in conjunction with deep learning techniques in order to detect and quantify other psychological constructs in textual data.
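As a rough illustration of the approach described in the abstract (not the authors' actual implementation or the published BERTAgent code), the sketch below shows how a BERT-style model might be fine-tuned as a regressor on human-coded agency ratings using the Hugging Face transformers library. The base model, example sentences, ratings, and hyperparameters are all illustrative assumptions.

```python
# Illustrative sketch only: fine-tuning a BERT-style regressor on
# human-coded agency ratings. Model name, data, and hyperparameters
# are assumptions, not the published BERTAgent configuration.
import torch
from torch.utils.data import Dataset, DataLoader
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-uncased"  # assumed base model

class AgencyDataset(Dataset):
    """Pairs of (sentence, mean human agency rating)."""
    def __init__(self, texts, scores, tokenizer, max_len=64):
        self.enc = tokenizer(texts, truncation=True, padding="max_length",
                             max_length=max_len, return_tensors="pt")
        self.scores = torch.tensor(scores, dtype=torch.float)

    def __len__(self):
        return len(self.scores)

    def __getitem__(self, i):
        item = {k: v[i] for k, v in self.enc.items()}
        item["labels"] = self.scores[i]
        return item

# Toy data standing in for the human-coded training sentences.
texts = ["We will achieve our goals.", "Nothing can be done about it."]
scores = [0.9, -0.7]  # hypothetical agency ratings

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
# num_labels=1 yields a single regression output; with float labels the
# model computes a mean-squared-error loss.
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=1)

loader = DataLoader(AgencyDataset(texts, scores, tokenizer),
                    batch_size=2, shuffle=True)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

model.train()
for epoch in range(3):  # illustrative number of epochs
    for batch in loader:
        optimizer.zero_grad()
        out = model(**batch)  # out.loss is MSE against the agency ratings
        out.loss.backward()
        optimizer.step()

# Score a new sentence: higher values indicate more strongly conveyed agency.
model.eval()
with torch.no_grad():
    enc = tokenizer("She took charge and solved the problem.", return_tensors="pt")
    print(model(**enc).logits.item())
```

In practice, a workflow of this kind would use the full set of human-rated sentences, a held-out validation split, and the convergent/discriminant validity checks described in the article.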