Adaptive Gaze and Hand Coordination while Manipulating and Monitoring the Environment in Parallel

Abstract

Research on eye-hand coordination has focused on action tasks performed in isolation. However, real-world action tasks are often performed concurrently with perception tasks that compete for gaze. Here we examine how participants adapt their eye and hand movements when performing an object manipulation task, in which they repeatedly grasped a ball and inserted it into a slot, while simultaneously monitoring a text display to detect probabilistically occurring letter changes. We varied the visuomotor demands of the action task by having participants use either their fingertips or tweezers. We found that fixations allocated to the action task were directed exclusively to the ball and slot, and were more prevalent when using tweezers. Ball and slot fixations were coupled in time with ball grasp and slot entry. On average, gaze shifted away from these landmarks ∼400 ms before contact when using fingertips, allowing the use of peripheral vision to direct the hand, but around the time of contact when using tweezers, additionally allowing central vision to guide the hand as it approached the ball or slot. We found that participants controlled the timing of their hand movements, as well as the timing and patterns (sequences of fixations) of their eye movements, to exploit the temporal regularities of the perception task, thereby lowering the probability that a letter change would occur during action-task fixations. Our results illustrate that eye-hand coordination can be flexibly and intelligently adapted when simultaneously acting on and perceiving the environment.
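The final claim, that timing fixations to the perception task's temporal regularities lowers the chance of a change occurring while gaze is on the ball or slot, can be made concrete with a small simulation. The sketch below is not the authors' model: it assumes a hypothetical renewal schedule in which inter-change intervals are uniform on [T_MIN, T_MAX] seconds (so no change can occur within T_MIN s of the previous one), and all parameter values are illustrative.

```python
import random

# Minimal Monte Carlo sketch (not the authors' model).
# Assumption: letter changes form a renewal process whose
# inter-change intervals are uniform on [T_MIN, T_MAX] seconds,
# creating a refractory window after each change.
T_MIN, T_MAX = 4.0, 8.0   # hypothetical temporal regularity (s)
FIX_DUR = 0.8             # hypothetical action-fixation duration (s)
TRIALS = 100_000

def change_during_fixation(start_after_change: float) -> bool:
    """Return True if the next letter change falls inside a fixation
    that starts `start_after_change` seconds after the last change."""
    next_change = random.uniform(T_MIN, T_MAX)
    return start_after_change <= next_change < start_after_change + FIX_DUR

# Naive strategy: fixation onset is unrelated to the change schedule.
naive = sum(
    change_during_fixation(random.uniform(0.0, T_MAX))
    for _ in range(TRIALS)
) / TRIALS

# Adaptive strategy: launch the action fixation right after a detected
# change, inside the refractory window where no new change can occur.
adaptive = sum(
    change_during_fixation(0.0) for _ in range(TRIALS)
) / TRIALS

print(f"P(change during action fixation), naive:    {naive:.3f}")
print(f"P(change during action fixation), adaptive: {adaptive:.3f}")
```

Under these assumed parameters, the adaptively timed fixation never overlaps a change, whereas the randomly timed one does so on a nontrivial fraction of trials; the participants' behavior described above amounts to exploiting an analogous regularity in the letter-change schedule.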
