AI conversation partners exceed expectations and elicit less anticipatory anxiety than human partners


Abstract

Does merely knowing that one will engage in conversation with an AI versus a real human differentially evoke anticipatory anxiety? Here we tested this question experimentally by asking participants to report their anticipatory anxiety and experiential expectations prior to engaging in each of two conversations: one with real humans and another with LLM-powered AI agents, both conducted through the same real-time speech-to-text and text-to-speech interface. Then, to evaluate how outcomes aligned with expectations, participants reported their actual experiences following each conversation. Results revealed significantly lower anxiety in anticipation of a conversation with AI partners compared to human partners, though this effect emerged only after the initial novelty subsided. Strikingly, participants underestimated the hedonic favorability of AI conversations, ultimately liking them more and feeling better after them than after the human conversations, despite holding relatively similar expectations for both. These findings support the notion that anxiety in anticipation of social interaction is substantially evoked by being perceived by another human, rather than by the act of conversation itself. As social AI agents become pervasive across social domains, understanding the similarities and differences in the ways we relate to them will be important for maximizing the benefits they can provide and minimizing their potential for harm.