How close is too close? Digital replicas and protection of the “unfixed” voice.
Abstract
Advancements in Generative Artificial Intelligence (AI), particularly in the creation of ‘digital replicas’, have raised significant personal, ethical and legal concerns, creating challenges for regulation and public receptivity. This paper, arising from an interdisciplinary team, is the first to explore the legal challenges posed by digital replicas within the broad context of sociological and linguistic perspectives. Owing to the prevalence of AI digital voice replicas in the creative industries, we use qualitative focus group data to examine the legal implications and potential impact of AI digital voice replicas within this domain, drawing on the views of researchers, creatives and IP practitioners. Our focus group findings illustrate a range of responses to AI and voice cloning, including scepticism, uncertainty, and a sense of inevitability. We find that the loss of what many refer to as the ‘authentic voice’ through cloning evokes an emotional response: the voice means a great deal to people. As AI evolves, we posit that authenticity as a concept may itself need reconceptualising. Against this backdrop, we highlight the urgent need to address the legal vulnerability of the ‘unfixed’ voice (a voice not permanently captured in traditional recordings) and the inadequacy of current United Kingdom (UK) Intellectual Property (IP) frameworks to protect it. We find that voices lacking fixed status remain largely unprotected, highlighting a significant regulatory lacuna. This shortfall calls for more robust protection of a ‘personality right’ in UK law.