The Weaponisation of Artificial Intelligence (AI): Legal Shortfalls and Regulatory Difficulties in Governing Non-Consensual Intimate Deepfakes (NCIDs)
Abstract
This article examines the increasingly prevalent threat posed by non-consensual intimate deepfakes (NCIDs): AI-generated sexually explicit content that depicts a real, identifiable person. It critiques the current legislative framework in the UK, which fails to criminalise the creation of NCIDs. While the Online Safety Act (OSA) 2023 criminalises the distribution of NCIDs, the act of creating them, whether for sexual gratification or for future criminal activity, remains lawful. Drawing on interdisciplinary research, victim testimony, and a comparative analysis of similar legislation within the European Union (EU), this article submits that current UK legislation fails to protect victim-survivors and overlooks the serious harms caused by the creation of NCIDs. Instead, we propose a strict liability model that focuses on the absence of consent rather than a defendant’s mens rea, aligning NCID offending with the broader framework of image-based sexual abuse (IBSA). This article concludes that immediate legislative reform is needed to criminalise the creation of NCIDs, close legal loopholes and, most importantly, protect the dignity, privacy and sexual autonomy of victim-survivors.