Performance rather than Reputation Affects Humans’ Trust towards an Artificial Agent
Abstract
To succeed in teamwork with artificial agents, humans have to calibrate their trust towards agents based on information they receive about an agent before interaction (reputation information) as well as on experiences they have during interaction (agent performance). This study (N = 253) focuses on the influence of a virtual agent's reputation (high/low) and actual observed performance (high/low) on a human user's behavioral trust (delegation behavior) and self-reported trust (questionnaires) in a cooperative Tetris game. The main findings suggest that agent reputation influences self-reported trust prior to interaction. However, the effect of reputation is immediately overridden by the agent's performance during the interaction. The agent's performance during the interactive task influenced delegation behavior as well as self-reported trust measured post-interaction. The pre- to post-interaction change in self-reported trust was significantly larger when reputation and performance were incongruent. We conclude that reputation might have a smaller influence on behavior than expected in the presence of a novel tool that affords exploration. Our research contributes to understanding trust and delegation dynamics, which is crucial for the design and adequate use of artificial agent team partners in a world of digital transformation.