No Coincidence, George: Capacity Limits in Cognitive Processing Reflect the Curse of Generalization


Abstract

The striking constraints of some human cognitive processes stand in stark contrast to the near-limitless capability of others. While we can acquire and flexibly use vast amounts of information, the amount we can process at any one time is often stiflingly limited: for example, the number of items we can hold in working memory or the number of tasks that can be performed at once. Here, we integrate ideas from information theory, cognitive science, and neuroscience to offer a unified account of why processing is often so limited. We argue that this reflects a fundamental tradeoff between generalization (how effectively existing representations can be used in novel settings) and how many distinct representations can be processed in parallel. Representations that best promote strong forms of generalization, a characteristically human cognitive strength, come at the expense of surprisingly strict limits on the number of items that can be processed at once, an equally characteristic human weakness. We refer to this as the "curse of generalization." We formulate this first in information-theoretic terms, and then in process models, including a neural network model of classic tasks used to demonstrate strict limits in human processing capacity. This tension offers a potential explanation for a range of phenomena, from performance on the tasks on which we focus, to representational learning and skill acquisition more broadly, as well as the performance of modern machine learning architectures that exhibit generalization capabilities comparable to humans.
