Unsupervised visual learning is revealed for task-irrelevant natural scenes due to reduced attentional suppression effects in visual areas


Abstract

Unsupervised learning—learning through repeated exposure without instruction or reward—is central to both machine learning and human cognition, including language acquisition and statistical learning. However, its role in visual perceptual learning (VPL) remains debated, as previous studies have not shown VPL for task-irrelevant but visible features, particularly with artificial stimuli. Here, we show that task-irrelevant exposure to natural scene images induces robust VPL, whereas artificial images lacking the complex structure characteristic of natural scenes, known as higher-order statistics, do not. Behavioral and fMRI results suggest that although unsupervised learning underlies VPL, it can be suppressed by top-down attention. Higher-order statistics may evade this suppression, possibly because their slower processing reaches visual areas beyond V1 outside the optimal temporal window for attentional suppression. These findings indicate that unsupervised learning underlies VPL, but that its occurrence depends on both higher-order stimulus structure and the brain’s attentional gating mechanisms.