Tracking visual rhythms: a concert of sensory and motor simulation
Abstract
Neural oscillations have been proposed to model external temporal structure by phase-coupling to environmental rhythms, thereby supporting adaptive perception. However, there is little evidence supporting these theories, particularly in the visual domain, and the underlying mechanisms remain unclear. Using MEG and a new empirical approach, we addressed this issue. Participants attended 1.3 and 2 Hz visual displays of rotating Gabors and judged either the timing or the content of these events. We show behaviourally relevant, rate-specific phase-coupling of motor structures to, and beyond, the visual rhythm, specifically when participants judged temporal features of the display. We subsequently devised a rate-specific decoding measure to show that visual structures track anticipated, temporally precise content regardless of task. This sensory simulation predicted the temporal tracking in motor structures. We consequently propose a mechanism by which automatic, temporally specific sensory simulation yields an information envelope that is read out by motor areas when estimating temporal characteristics of our environment.
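For readers unfamiliar with how rate-specific phase-coupling can be quantified, the sketch below illustrates one common approach, inter-trial phase coherence at the stimulation rate, applied to simulated single-channel epoched data. This is not the authors' analysis pipeline; the function name, data shapes, and parameters are assumptions made purely for illustration.

```python
# Illustrative sketch only (not the authors' method): inter-trial phase
# coherence (ITC) at a specific stimulation rate, e.g. 1.3 or 2 Hz.
import numpy as np

def itc_at_rate(epochs, sfreq, rate_hz):
    """ITC at one frequency.

    epochs : array, shape (n_trials, n_samples) -- single-channel epoched data
    sfreq  : sampling frequency in Hz
    rate_hz: stimulation rate of interest (hypothetical values: 1.3 or 2.0 Hz)
    """
    n_trials, n_samples = epochs.shape
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / sfreq)
    idx = np.argmin(np.abs(freqs - rate_hz))        # FFT bin closest to the rate
    spectrum = np.fft.rfft(epochs, axis=1)[:, idx]  # complex value per trial
    phases = spectrum / np.abs(spectrum)            # unit-length phase vectors
    return np.abs(phases.mean())                    # 1 = perfect phase alignment across trials

# Toy usage: 50 simulated trials, 10 s at 250 Hz, with a weak 1.3 Hz
# component that is phase-locked across trials plus Gaussian noise.
rng = np.random.default_rng(0)
sfreq, dur, n_trials = 250, 10.0, 50
t = np.arange(0, dur, 1.0 / sfreq)
epochs = 0.5 * np.sin(2 * np.pi * 1.3 * t) + rng.standard_normal((n_trials, t.size))
print(itc_at_rate(epochs, sfreq, 1.3))  # elevated at the stimulated rate
print(itc_at_rate(epochs, sfreq, 2.0))  # near the noise floor (~1/sqrt(n_trials))
```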