From Dennett to Transformers: Emergent Properties, Kolmogorov Complexity, and the Turing-Computability of Human Intelligence
Abstract
This paper explores the Turing-computability of human cognition through the lens of the late philosopher Daniel Dennett’s computational functionalism, Kolmogorov Complexity, and recent advances in large language models (LLMs). We compare the high complexity and capacity of the human brain to the deliberate scaling of LLMs, noting that emergent properties—such as zero-shot reasoning—arise when models reach critical thresholds in parameter count and training data. By analyzing whether these emergent capabilities can approximate human-level intelligence, we shed light on the debate concerning the algorithmic replicability of human consciousness. We suggest that capacity, complexity, and scaling play pivotal roles in shaping advanced cognitive behaviors, offering preliminary insights into artificial general intelligence (AGI). Although the emphasis is primarily theoretical, we briefly note that transformer models have the potential to unlock new analytical capabilities in fields such as astrophysics and cosmology, highlighting the broad impact of scaling laws beyond natural language alone.