Humans can learn bimodal priors in complex sensorimotor behaviour
Abstract
Extensive research suggests that humans integrate sensory information and prior expectations in a Bayesian manner to reduce uncertainty in perception and action. However, while Bayesian integration provides a powerful explanatory framework, it remains unclear to what extent it explains human behaviour in naturalistic situations involving more complex movements and distributions. Here, we examine whether humans can learn bimodal priors in a complex sensorimotor task: returning tennis serves. Participants returned serves in an immersive virtual reality setup with realistic movements and spatiotemporal task demands matching those of real tennis. The location of the opponent’s serves followed a bimodal distribution, and we manipulated visual uncertainty through three levels of ball speed: slow, moderate, and fast. After extensive exposure to the opponent’s serves, participants’ movements were biased by the bimodal prior distribution, and, as predicted by Bayesian theory, the magnitude of this bias depended on visual uncertainty. Additionally, our data indicate that participants’ movements in this complex task were biased not only by prior expectations but also by biomechanical constraints and the associated motor costs. Intriguingly, an explicit knowledge test after the experiment revealed that, despite incorporating prior knowledge of the opponent’s serve distribution into their behaviour, participants were not explicitly aware of the pattern. Our results show that humans can implicitly learn and utilise bimodal priors in complex sensorimotor behaviour.
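As a sketch of the standard Bayesian prediction invoked here (our own notation, not the authors’ model; the symbols μ_k, σ_p, σ_ℓ, π_k are assumptions for illustration): writing the bimodal prior as a two-component Gaussian mixture and the sensory evidence as a Gaussian likelihood whose width σ_ℓ reflects visual uncertainty,

\[
p(s) = \sum_{k=1}^{2} \pi_k\,\mathcal{N}\!\left(s;\,\mu_k,\,\sigma_p^{2}\right),
\qquad
p(x \mid s) = \mathcal{N}\!\left(x;\,s,\,\sigma_\ell^{2}\right),
\]

the posterior is again a two-component mixture whose component means are precision-weighted averages of the observation and the prior modes:

\[
p(s \mid x) \;\propto\; \sum_{k=1}^{2} \tilde{\pi}_k(x)\,
\mathcal{N}\!\left(s;\;
\frac{\sigma_p^{2}\,x + \sigma_\ell^{2}\,\mu_k}{\sigma_p^{2} + \sigma_\ell^{2}},\;
\frac{\sigma_p^{2}\,\sigma_\ell^{2}}{\sigma_p^{2} + \sigma_\ell^{2}}
\right),
\qquad
\tilde{\pi}_k(x) \;\propto\; \pi_k\,\mathcal{N}\!\left(x;\,\mu_k,\,\sigma_p^{2} + \sigma_\ell^{2}\right).
\]

The larger σ_ℓ is relative to σ_p, the more each component mean is pulled away from the observation x toward the nearer prior mode μ_k, which is the qualitative pattern described above: the bias toward the bimodal prior grows as visual uncertainty increases.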