Emergent Symbolic Cognition: A Unifying Computational Framework for Symbolic Thought in Humans and LLMs

Abstract

Because language is so ubiquitous in mental life, its precise role in enabling structured thought remains difficult to study in humans. However, the surprising emergence of fluent and coherent language in fundamentally connectionist Large Language Models (LLMs), systems that learn only from text, provides a novel empirical lens on language's role in cognition, challenging the historical symbolic-connectionist dichotomy and compelling its re-evaluation. We introduce the Emergent Symbolic Cognition (ESC) framework, proposing that general intelligence arises from the interaction between a parallel, adaptive substrate (such as a brain or an Artificial Neural Network) and a structured symbolic framework, such as language, internalized through statistical learning. The substrate's parallel architecture enables fast, intuitive pattern-matching, but it is fundamentally ill-suited to the serial processing required for structured thought (e.g., causal reasoning, logical deduction). ESC proposes that recursive symbolic generation is the mechanism that overcomes this limitation. By generating a symbol (a word or token) and feeding it back as input, the substrate operates as a serial symbolic processor, akin to a Turing machine. This recursive loop allows it to explore the vast combinatorial space of potential thoughts and construct novel, structured solutions. ESC finds compelling support in diverse phenomena, both in biological cognition (evidenced by neuroscience, inner-speech, and cross-cultural studies) and in the nascent reasoning and abstract representations displayed by LLMs. ESC thus offers a unifying framework, reconciling connectionist learning with symbolic competence by reframing language as an inherited cognitive engine for general intelligence.
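
To make the proposed mechanism concrete, the following is a minimal, hypothetical sketch of the recursive symbolic generation loop the abstract describes: a fast, parallel "substrate" step maps the current context to a next-symbol distribution, and a serial outer loop feeds each generated symbol back in as input. The `substrate` function, the `recursive_generation` helper, and the toy transition table are illustrative assumptions introduced here, not details taken from the article.

```python
import random

# Toy "substrate": given the context so far, return a distribution over the
# next symbol. Here it is a simple lookup table; in an LLM this role would be
# played by a single parallel forward pass of the network. (Hypothetical.)
TRANSITIONS = {
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"sat": 0.5, "ran": 0.5},
    "sat": {"<end>": 1.0},
    "ran": {"<end>": 1.0},
}

def substrate(context):
    """One fast, parallel step: map the current context to next-symbol probabilities."""
    return TRANSITIONS.get(context[-1], {"<end>": 1.0})

def recursive_generation(prompt, max_steps=10):
    """Serial loop: emit a symbol, append it to the context, and feed it back in.

    The seriality lives entirely in this outer loop, not in the substrate
    itself; this is the sense in which recursive symbolic generation turns a
    parallel pattern-matcher into a serial symbolic processor.
    """
    context = list(prompt)
    for _ in range(max_steps):
        dist = substrate(context)
        symbols, weights = zip(*dist.items())
        symbol = random.choices(symbols, weights=weights)[0]
        if symbol == "<end>":
            break
        context.append(symbol)  # the generated symbol becomes new input
    return context

print(recursive_generation(["the"]))  # e.g. ['the', 'cat', 'sat']
```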
