Task-relevant information probability shapes eye movements and perceptual judgment confidence

Abstract

Humans continuously decide where to look to gather task-relevant information. While affective rewards such as money are known to bias gaze direction, it remains unclear whether non-affective informational value can similarly shape oculomotor decisions. Here, we modulated the probability of finding task-relevant visual information at saccade targets while human participants performed a perceptual judgment task. Participants developed implicit biases, increasingly avoiding the low-information region. These learned preferences were also reflected in longer saccade latencies toward non-preferred regions, mirroring patterns observed with affective reward learning. Saccade peak velocity, however, remained unchanged across locations, and perceptual accuracy was likewise unaffected. When participants' confidence ratings reliably distinguished correct from incorrect responses, confidence was higher for preferred regions, suggesting a dissociation between perceptual and metacognitive performance. These findings demonstrate that the probability of accessing usable information can be implicitly learned and, much like reward, can guide eye movement decisions. Moreover, learned preferences can influence subjective confidence without altering perceptual performance. Our results highlight that informational value, independent of affective cues, shapes oculomotor decision-making and post-perceptual judgment confidence.
