Technocultural Hegemony: What Role Does Natural Language Processing Play in the Reinforcement of Dominant Cultural Narratives?
Abstract
While Natural Language Processing (NLP) tools keep gaining popularity among users around the globe, the vast majority of them are developed in the West, mainly in the United States. Although numerous studies have shown that NLP tools do not perform equally well across languages and cultural contexts, little research has been conducted on the broader consequences of these performance disparities. Drawing on evidence from previous research, this study aims to bridge that gap by exploring how NLP and existing cultural hierarchies can be mutually constitutive. The paper first reviews the existing literature on the performance of NLP tools on underrepresented languages and in non-Western cultural contexts. It then takes a critical theory approach to examining the broader cultural implications of the shortcomings identified in the review. More specifically, this work uses the concept of technoculture proposed by Lelia Green to connect the technological and cultural aspects of NLP, and draws on Gramsci's theory of cultural hegemony to explore how bias in NLP tools reinforces the dominant cultural narratives overrepresented in their training data. The study argues that NLP applications help entrench dominant norms, ideals, and modes of expression as universal, thereby marginalizing alternative worldviews and imposing normative standards of communication on users from different backgrounds. The analysis concludes that, as the popularity of NLP tools keeps growing worldwide, so will their influence on what is perceived as "common sense." The study therefore emphasizes the importance of ensuring equitable representation of the user base throughout the entire NLP development pipeline.