Imaging the dancing brain: Decoding sensory, motor and social processes during dyadic dance
Abstract
Real-world social cognition requires processing and adapting to multiple dynamic information streams. Interpreting neural activity in such ecological conditions remains a key challenge for neuroscience. This study leverages advances in denoising techniques and multivariate modeling to extract interpretable EEG signals from pairs of participants engaged in spontaneous dyadic dance. Using multivariate temporal response functions (mTRFs), we investigated how music acoustics, self-generated kinematics, other-generated kinematics, and social coordination each uniquely contributed to EEG activity. Electromyogram recordings from ocular, face, and neck muscles were also modeled to control for muscle artifacts. The mTRFs effectively disentangled neural signals associated with four key processes: (I) auditory tracking of music, (II) control of self-generated movements, (III) visual monitoring of partner movements, and (IV) visual tracking of social coordination accuracy. We show that the first three neural signals are driven by event-related potentials: the P50-N100-P200 complex triggered by acoustic events, the central lateralized readiness potential triggered by movement initiation, and the occipital N170 triggered by movement observation. Notably, the previously unknown neural marker of social coordination encodes the spatiotemporal alignment between dancers, surpassing the encoding of self- or partner-related kinematics taken alone. This marker emerges when partners make visual contact, relies on visual cortical areas, and is specifically driven by movement observation rather than initiation. Using data-driven kinematic decomposition, we further show that vertical movements best drive observers’ EEG activity. These findings highlight the potential of real-world neuroimaging, combined with multivariate modeling, to uncover the mechanisms underlying complex yet natural social behaviors.
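For orientation, the sketch below illustrates the general logic of this kind of encoding-model analysis: EEG is regressed onto time-lagged copies of several predictor streams with ridge regression, yielding one temporal response function per predictor. All variable names, the sampling rate, lag window, and regularization strength are illustrative assumptions using placeholder data; this is a minimal sketch of the technique, not the authors' pipeline (which additionally modeled EMG regressors and estimated multivariate TRFs across all EEG channels).

import numpy as np
from sklearn.linear_model import Ridge

def lagged_design(features, lags):
    # Stack time-lagged copies of each predictor column (zero-padded at the start).
    n_times, n_feat = features.shape
    X = np.zeros((n_times, n_feat * len(lags)))
    for i, lag in enumerate(lags):
        shifted = np.roll(features, lag, axis=0)
        if lag > 0:
            shifted[:lag] = 0
        X[:, i * n_feat:(i + 1) * n_feat] = shifted
    return X

sfreq = 100                                  # Hz, assumed sampling rate of EEG and predictors
rng = np.random.default_rng(0)
n_times = 60 * sfreq                         # one minute of placeholder data
# Placeholder predictor streams standing in for music envelope, self-generated
# kinematics, partner-generated kinematics, and a coordination-accuracy signal.
stimuli = rng.standard_normal((n_times, 4))
eeg = rng.standard_normal(n_times)           # placeholder single EEG channel

lags = np.arange(0, int(0.4 * sfreq))        # encoding lags from 0 to 400 ms
X = lagged_design(stimuli, lags)
model = Ridge(alpha=1.0).fit(X, eeg)         # regularized linear encoding model
trf = model.coef_.reshape(len(lags), stimuli.shape[1]).T   # (predictor, lag) TRF weights

Each row of trf is then a temporal response function describing how one predictor stream relates to the EEG channel across lags; comparing models with and without a given predictor quantifies its unique contribution.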
Significance statement
Real-world brain function involves integrating multiple information streams simultaneously. However, for lack of suitable computational methods, laboratory-based neuroscience often examines neural processes in isolation. Using multivariate modeling of EEG data from pairs of participants freely dancing to music, we demonstrate that it is possible to tease apart physiologically established neural processes associated with music perception, motor control, and observation of movement produced by a dance partner. Crucially, we identify a previously unknown neural marker that encodes coordination accuracy between dancers, beyond the contributions of self- or partner-related kinematics alone. These findings highlight the potential of computational neuroscience to uncover the biological mechanisms underlying real-world social and motor behaviors, advancing our understanding of how the brain supports dynamic, interactive activities.