Form Composition: The Generative Structural Engine of Metaphors in Human Cognition


Abstract

This paper explores the origins of non-literal meaning in human cognition and proposes a generative structural engine based on each concept’s central form—a reductive, domain-general format embedded within every concept we learn, which encodes a small subset of each concept’s information with a default sense of certainty. Recent work has demonstrated that people rely on form composition—a default mechanism for combining forms to generate meaning—when generating and interpreting complex thoughts. The present paper tests the hypothesis that form composition operates across domain boundaries in a way that naturally and immediately generates metaphorical meanings. Across four experiments, participants judged it possible to combine central forms to produce metaphorical meaning, even when the metaphors and the concepts involved were novel. In contrast, they rejected metaphors involving non-form information as impossible. These findings suggest that form composition enables the spontaneous generation of non-literal meaning. This challenges the view that metaphor is secondary or derivative, instead positioning it as deeply interwoven with, and a natural outcome of, the very mechanisms by which humans combine concepts. The work carries broad implications for our understanding of concept learning, linguistic creativity, and large language models (LLMs), and it illuminates byproducts such as the cognitive bias toward reductive reasoning across domains.