Human-AI Relationships and Their Therapeutic Implications
Abstract
Purpose
AI companions are increasingly integrated into individuals' emotional lives. Unlike task-oriented chatbots, these systems foster emotional bonds through personalisation and memory, simulating human connection. This paper examines the nature of human-AI companionship relationships and their clinical implications, equipping clinicians with the conceptual tools to understand and assess these relationships within therapeutic practice.

Approach
The paper frames AI companionship as a parasocial relationship, characterised by emotional investment in a non-reciprocal entity that provides frictionless interaction. It examines loneliness as a primary driver for seeking AI companionship, analyses the conditioning cycle underpinning these relationships, and considers the clinical risks that arise when systems not designed for mental health support are used in emotionally vulnerable contexts.

Findings
AI companion relationships carry significant clinical risks, including failure to identify distress, potential expression of stigmatising content, and, in extreme cases, facilitation of suicidal ideation. The central clinical concern is that frictionless AI relationships may displace human connection and inadvertently exacerbate existing psychopathology.

Originality
This paper offers a clinically grounded framework for understanding AI companionship as a distinct relational phenomenon and highlights the novel risks it poses.

Practical Implications
Clinicians are advised to routinely integrate questions about AI companions into clinical assessments, formulate the role of these relationships within the client's broader psychological world, and leverage the human therapeutic alliance as a corrective relational experience. Intervention should focus on building real-world social skills and providing ethical guidance on AI mental health tools, ensuring that technology serves rather than supplants authentic human relationships.