Go Cry to GPT: People Offload Empathy to AI
Abstract
Empathic Artificial Intelligence (AI) can be helpful, making people feel supported, but it has a dark side. When AI can provide emotional support, people become less willing to empathize with the suffering of others. Four studies reveal "empathy offloading": when people see AI as capable of empathy, they spare themselves the emotional labor of connecting with others. Higher perceptions of AI's capacity for empathy increase empathy offloading, both correlationally (Study 1) and causally (Study 2). When people learn that someone has received emotional support from large language models (LLMs), they become less willing to empathize with that person (Studies 3 and 4). Empathy offloading extends beyond simple scale ratings to include reduced perspective-taking (Study 3) and fewer empathic responses (Study 4). Empathic AI might provide low-cost, easily accessible emotional support, but it risks fraying our ethical and emotional commitments to other human beings.