Mistaking covariance for combination in sensorimotor adaptation: Regression slopes do not test additivity
Abstract
Sensorimotor adaptation depends on implicit recalibration and explicit strategy. These processes are commonly assumed to sum (A = I + E), and this additivity assumption justifies subtractive measurement and informs computational models of motor learning. Recent work has challenged additivity by examining regression slopes between implicit and explicit measures. When slopes deviate from β = –1, the interpretation has been that the processes are "sub-additive" and fail to sum as expected. Here, we show this reasoning is mistaken. Regression slopes reflect covariance structure: how learning processes relate across individuals. Additivity is a claim about motor output combination: whether learning processes sum within individuals. These are different questions, and regression slopes do not address the latter. We derive the expected slope under subtractive logic and show it equals β = –1 only when total adaptation is uncorrelated with the measured learning component. Monte Carlo simulations confirm this benchmark is routinely rejected under realistic covariance between learning processes, even when additivity is enforced. Under independent measurement, the slope still depends on the covariance structure of the learning processes, which is not constrained by additivity. For this reason, there is no regression slope benchmark for diagnosing additivity. Moreover, the regression slopes reported in the literature fall within the range predicted by shared-error models where implicit and explicit systems are driven by the same error signals, which adhere to the additivity assumption. Regression slopes only tell us whether learning processes covary, not whether they violate additivity.
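The central algebraic point can be sketched in a short simulation. This is an illustrative reconstruction, not the paper's actual code: implicit (`I`) and explicit (`E`) learning are drawn with an assumed correlation `rho`, additivity `A = I + E` is enforced by construction, and the implicit component is then inferred subtractively as `I_hat = A - E`. The regression slope of `I_hat` on `E` equals Cov(A, E)/Var(E) − 1, which reduces to −1 only when total adaptation is uncorrelated with the explicit measure; here it comes out near `rho` instead, even though additivity holds exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Assumed parameters for illustration: unit variances, correlation rho
rho = 0.5
cov = [[1.0, rho], [rho, 1.0]]
I, E = rng.multivariate_normal([10.0, 5.0], cov, size=n).T
A = I + E  # additivity enforced by construction

# Subtractive measurement: infer implicit as total minus explicit,
# then regress the inferred implicit measure on the explicit measure.
I_hat = A - E
slope = np.cov(I_hat, E)[0, 1] / np.var(E, ddof=1)
# slope = Cov(A, E)/Var(E) - 1 = rho here, not -1, despite additivity
print(slope)  # close to 0.5

# beta = -1 emerges only when Cov(A, E) = 0, e.g. if A is constant
A_fixed = np.full(n, 15.0)
I_fixed = A_fixed - E
slope_fixed = np.cov(I_fixed, E)[0, 1] / np.var(E, ddof=1)
print(slope_fixed)  # -1 up to floating point
```

The second case shows why the β = −1 benchmark is a statement about covariance structure, not about how the processes combine: it holds only in the degenerate situation where total adaptation carries no information about the explicit measure.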