Disentangling objects' contextual associations from perceptual and conceptual attributes using time-resolved neural decoding
Abstract
Humans effortlessly relate what they see to what they know, drawing on existing knowledge of objects' perceptual, conceptual, and contextual attributes while searching for and recognising objects. While prior studies have investigated the temporal dynamics of perceptual and conceptual object properties in the neural signal, it remains unclear whether and when contextual associations are uniquely represented. In this study, we used representational similarity analysis on electroencephalography (EEG) data to explore how the brain processes the perceptual, conceptual, and contextual dimensions of object knowledge over time. Using human similarity judgments of 190 naturalistic object concepts presented as either images or words, we constructed separate behavioural models of objects' perceptual, conceptual, and contextual properties. We correlated these models with neural patterns from two EEG datasets, one publicly available and one newly collected, both recorded while participants passively viewed the same object stimuli. Across both datasets, we found that perceptual features dominated the early EEG response to object images, while conceptual features emerged later. Contextual associations were also reflected in neural patterns, but their explanatory power largely overlapped with that of conceptual models, suggesting limited unique representation of objects' contextual attributes under passive viewing conditions. These results highlight the brain's integration of perceptual and conceptual information when processing visual objects. By combining high temporal resolution EEG with behaviourally derived models, this study advances our understanding of how distinct dimensions of object knowledge are encoded in the human brain.
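The core analysis described above can be sketched as follows. This is a minimal illustration with synthetic data, not the authors' analysis code: the array shapes, variable names, and distance metrics are assumptions. It computes a neural representational dissimilarity matrix (RDM) at each EEG timepoint and correlates it with a behavioural model RDM, yielding a timecourse of model–brain similarity.

```python
# Time-resolved RSA sketch (synthetic data; shapes and metrics are assumptions).
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_objects, n_channels, n_times = 190, 64, 100  # 190 object concepts, as in the study

# Hypothetical condition-averaged EEG patterns: objects x channels x timepoints.
eeg = rng.standard_normal((n_objects, n_channels, n_times))

# Hypothetical behavioural model RDM (condensed vector form), e.g. derived
# from similarity judgments of the objects' perceptual/conceptual/contextual properties.
model_rdm = pdist(rng.standard_normal((n_objects, 5)))

# Correlate the model RDM with the neural RDM at each timepoint.
rsa_timecourse = np.empty(n_times)
for t in range(n_times):
    neural_rdm = pdist(eeg[:, :, t], metric="correlation")
    rsa_timecourse[t] = spearmanr(neural_rdm, model_rdm)[0]
```

In the actual study, separate perceptual, conceptual, and contextual model RDMs would each be correlated with the neural timecourse, and variance partitioning (e.g. partial correlation) would assess each model's unique contribution.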