Moderation of Virtual Worlds and Its Discontents: A Walkthrough Case Study of Roblox

Abstract

Immersive virtual worlds are increasingly scrutinized for the novel harms their techno-social affordances enable, yet traditional content moderation approaches have proven inadequate. In this article, we analyse Roblox—a highly popular yet under-researched platform with over 100 million daily active users—as a case study, applying the interface walkthrough method to its VR and mobile versions. We adopt a multi-layered governance approach that examines moderation at the i) platform, ii) developer, and iii) user levels, illustrating how stakeholders negotiate norms, shape behaviour, and co-mitigate harm. Our findings reveal what we term pseudo-decentralization: the platform often delegates responsibility for harm mitigation downward to developers, users, and parents, while simultaneously centralizing decision-making, platform infrastructure, and monetization. Ambiguous divisions of responsibility and opaque automated systems produce accountability gaps, which users address through vigilante behaviour and grassroots organizing. We conclude that effective moderation in virtual worlds requires collaborative, transparent governance that balances power and responsibility across stakeholders.
