Interpersonal Synchrony in Human-Robot Interaction: Sensor Analysis of Interpersonal Interaction Using VR Data

Abstract

We investigated interpersonal synchrony in a virtual reality (VR) environment using sensor data collected from pairs of participants. In the VR setting, 28 participants were positioned face-to-face in pairs and tasked with identifying the category of a central object, responding through physical movements. Using three-axis head acceleration data from Head-Mounted Displays (HMDs), we developed a Long Short-Term Memory (LSTM) machine learning model to classify interactions as competitive (‘Game’ category) or synchronous (‘Collab’ category), achieving 93% accuracy. Time series analysis revealed that synchronization patterns differed between the Game and Collab categories, with Collab-type synchrony interpreted as participants consciously matching each other. This model was then applied to a human-robot collaborative task to examine synchrony with bot avatars. Synchrony was assessed using Dynamic Time Warping (DTW) distance, with experimental conditions manipulating the bot’s rate of correct responses (Accuracy condition). Analysis revealed that the main effect of the Accuracy condition was not statistically significant; however, the interaction between the Accuracy condition and the Predicted category had a significant effect on DTW distance, with a significant decrease in DTW distance in the Game category under the Low accuracy condition (p < .001). The results suggest that unintentional synchronization is disrupted when the bot behaves unpredictably.
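The abstract quantifies synchrony as the Dynamic Time Warping (DTW) distance between paired movement signals. As an illustration only (the paper's exact preprocessing, window lengths, and distance parameters are not given here, and a 1-D signal is assumed in place of the three-axis data), a minimal DTW distance can be sketched as:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) DTW distance between two 1-D series.

    Lower values mean the two movement traces can be aligned closely,
    i.e. stronger synchrony between the paired signals.
    """
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    # cost[i, j] = minimal cumulative cost of aligning a[:i] with b[:j]
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # step in a only
                                 cost[i, j - 1],      # step in b only
                                 cost[i - 1, j - 1])  # step in both
    return cost[n, m]

# Identical traces have zero distance; a time-shifted copy also aligns
# at zero cost because DTW warps the time axis.
print(dtw_distance([0, 1, 2, 1, 0], [0, 1, 2, 1, 0]))     # → 0.0
print(dtw_distance([0, 1, 2, 1, 0], [0, 0, 1, 2, 1, 0]))  # → 0.0
```

For multi-axis head acceleration, the per-sample cost `abs(...)` would typically be replaced by a Euclidean distance between 3-D samples; the dynamic-programming recurrence is unchanged.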