Inter-individual similarities in internal models of the world shape similarities in the perception and neural processing of scenes

Abstract

Every person perceives the visual world in their own unique way, yet we still know little about the origins of these individual differences. Here, we propose that idiosyncrasies in internal models—mental representations of what the world should look like—shape how individuals perceive and process natural scenes. To characterize these internal models, participants drew what they considered the most typical version of a given scene category. Using a combination of deep learning tools, we quantified inter-subject similarities in these drawings and used them to predict inter-subject similarities in perceptual task performance and neural responses to a fully independent set of natural scenes. Individuals with more similar internal models showed more similar scene categorization performance and judged several scene properties (typicality, usability, and complexity) more similarly. Moreover, variations in participants' internal models predicted inter-subject correlations in BOLD time courses in the lateral occipital and lateral prefrontal cortices. These results provide a novel, mechanistic explanation of how perceptual and neural alignment across individuals is shaped by idiosyncrasies in prior experience.