Distinct prior expectations shape tactile and proprioceptive localization
Abstract
When a mosquito lands on your finger, swatting it away requires your brain to compute its location in external space, which depends on the body's 3D posture. Two competing hypotheses explain how the brain solves this challenge: the integration hypothesis, in which tactile signals are transformed into spatial coordinates by integrating touch and posture information; and the cueing hypothesis, in which touch merely cues a location on the body whose position is then specified via proprioception. Adjudicating between these hypotheses is nearly impossible without modeling the latent factors underlying somatosensory spatial perception. The present study fills this gap. We first formalized each hypothesis from a Bayesian perspective: if touch merely triggers proprioceptive localization (cueing hypothesis), tactile and proprioceptive localization should rely on the same Bayesian computations, with identical prior expectations about the mosquito's spatial location; if they involve distinct Bayesian computational processes (integration hypothesis), distinct prior expectations may shape tactile and proprioceptive localization. To test these predictions, we had nineteen participants localize either proprioceptive or tactile targets on their fingertips and fit their data with several Bayesian models of each hypothesis. Models allowing different prior distributions between modalities provided the best fit for most participants, with 17 of 19 participants showing significantly different prior distributions across modalities. These results provide strong computational evidence that tactile and proprioceptive localization rely on distinct computational mechanisms, a conclusion with important implications for how we understand these everyday behaviors and their neural underpinnings.
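As a rough illustration of the formal contrast (a sketch of the standard Gaussian Bayesian-observer form, not necessarily the authors' exact models), the reported location can be written as a precision-weighted compromise between a sensory measurement m and a prior mean:

\[
\hat{x} \;=\; \frac{\sigma_p^{2}}{\sigma_p^{2}+\sigma_s^{2}}\, m \;+\; \frac{\sigma_s^{2}}{\sigma_p^{2}+\sigma_s^{2}}\, \mu_p ,
\]

where sigma_s^2 is the sensory noise variance and (mu_p, sigma_p^2) parameterize the prior. Under the cueing hypothesis a single (mu_p, sigma_p^2) should account for both tactile and proprioceptive localization; under the integration hypothesis the two modalities may require different values.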
Author Summary
When a mosquito lands on your finger and you swat it away, your brain must solve a challenging problem: determining where the mosquito is in space based on where it touched your skin and where your finger is positioned. Scientists have debated how the brain accomplishes this. One hypothesis proposes that the brain transforms touch signals by combining them with information about body posture, an integration process. An alternative hypothesis suggests that touch simply signals which body part was contacted, and the brain then locates that body part using proprioception alone, treating touch as merely a cue. Although these hypotheses make different predictions, distinguishing between them from behavior alone has proven difficult because the underlying computations remain hidden. We addressed this by having participants locate either touches on their fingertips or the fingertips themselves, and then used Bayesian computational modeling to reveal the spatial expectations guiding their judgments. Our models showed that tactile and proprioceptive localization rely on distinct spatial expectations, with 17 of 19 participants showing significantly different patterns. These findings provide computational evidence that localizing touch involves transformations beyond simply locating the body, supporting the integration hypothesis and challenging the idea that touch merely cues body location.
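For readers who want to see the model-comparison logic spelled out, the following is a minimal, hypothetical sketch (simulated data; illustrative parameter values and function names, not the paper's code or its exact models) of fitting a shared-prior model against a modality-specific-prior model and comparing them with AIC:

```python
# Hypothetical sketch (not the paper's code): compare a shared-prior versus a
# modality-specific-prior Gaussian observer model of localization responses.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulated experiment: true target positions plus noisy localization
# responses in a "tactile" and a "proprioceptive" condition (arbitrary units).
targets = rng.uniform(-10, 10, size=200)

def simulate(targets, mu_p, sigma_p, sigma_s, sigma_r=0.5):
    m = targets + rng.normal(0, sigma_s, size=targets.size)   # noisy sensory sample
    w = sigma_p**2 / (sigma_p**2 + sigma_s**2)                 # weight on sensation
    xhat = w * m + (1 - w) * mu_p                              # posterior-mean estimate
    return xhat + rng.normal(0, sigma_r, size=targets.size)    # motor/response noise

resp_tactile = simulate(targets, mu_p=2.0, sigma_p=3.0, sigma_s=2.0)
resp_proprio = simulate(targets, mu_p=-1.0, sigma_p=5.0, sigma_s=2.0)

def nll(params, targets, responses):
    """Negative log-likelihood of one condition under the Gaussian observer."""
    mu_p, sigma_p, sigma_s, sigma_r = params
    w = sigma_p**2 / (sigma_p**2 + sigma_s**2)
    pred = w * targets + (1 - w) * mu_p
    sd = np.sqrt((w * sigma_s)**2 + sigma_r**2)
    return -norm.logpdf(responses, loc=pred, scale=sd).sum()

# Integration-style model: each modality gets its own prior (mu_p, sigma_p);
# sensory and response noise are shared for simplicity.
nll_distinct = lambda p: (nll([p[0], p[1], p[4], p[5]], targets, resp_tactile)
                          + nll([p[2], p[3], p[4], p[5]], targets, resp_proprio))
fit_distinct = minimize(nll_distinct, x0=[0, 3, 0, 3, 2, 1], method="L-BFGS-B",
                        bounds=[(-10, 10), (0.1, 20)] * 2 + [(0.1, 20)] * 2)

# Cueing-style model: a single prior shared across both modalities.
nll_shared = lambda p: nll(p, targets, resp_tactile) + nll(p, targets, resp_proprio)
fit_shared = minimize(nll_shared, x0=[0, 3, 2, 1], method="L-BFGS-B",
                      bounds=[(-10, 10)] + [(0.1, 20)] * 3)

aic = lambda result, k: 2 * k + 2 * result.fun   # AIC = 2k - 2 ln(L)
print("AIC, distinct priors:", round(aic(fit_distinct, 6), 1))
print("AIC, shared prior:   ", round(aic(fit_shared, 4), 1))
```

If the modality-specific-prior model wins this kind of comparison despite its extra parameters, that mirrors the pattern of evidence the authors report for distinct priors across touch and proprioception.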