Reversible Computation as the Generator of Mutual Information: Toward a Thermodynamic Theory of Structural Persistence
Abstract
Mutual information is a foundational concept in information theory, capturing how much one state of a system tells us about another. Reversible computation, in contrast, has traditionally been explored in the context of energy efficiency and logical reversibility, where information is preserved without heat dissipation. In this paper, we propose a novel and foundational insight: mutual information arises as a consequence of reversible computation. That is, the measurable information retained between successive states of a system reflects the underlying reversibility of its computational or physical transitions.

We formalize this idea through an informational theorem: in any bounded or closed system, the existence of non-zero mutual information between temporally or spatially separated states implies the presence of reversible computation or structural memory. This theorem bridges Shannon's theory with Landauer's principle, linking entropy, reversibility, and persistence in a unified framework. We explore its implications for condensed matter physics, phase transitions, neural systems, and the foundations of cognition and artificial intelligence.

This work builds toward a broader theory in which information persistence — rather than energy conservation alone — becomes the key unifying principle across physical and cognitive systems.
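The abstract's central claim — that mutual information between successive states tracks the reversibility of the transition — can be illustrated with a minimal sketch (my own example, not taken from the paper): a bijective update on a finite state space preserves all the information about the prior state, while a many-to-one "erasing" update destroys it, and the empirical mutual information reflects exactly this.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Empirical mutual information I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    pxy = Counter(pairs)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_xy = c / n
        # I(X;Y) = sum p(x,y) log2( p(x,y) / (p(x) p(y)) )
        mi += p_xy * math.log2(p_xy * n * n / (px[x] * py[y]))
    return mi

states = list(range(8))  # uniform distribution over 8 states (3 bits)

# Reversible update: a permutation of the state space (bijective, invertible).
reversible = lambda s: (s + 3) % 8
# Irreversible update: many-to-one erasure, collapsing every state to 0.
erasing = lambda s: 0

mi_rev = mutual_information([(s, reversible(s)) for s in states])  # → 3.0 bits
mi_era = mutual_information([(s, erasing(s)) for s in states])     # → 0.0 bits
```

Under the reversible (permutation) dynamics the next state determines the previous one, so I(X;Y) equals the full 3 bits of state entropy; under the erasing map the successor carries no information about its predecessor and I(X;Y) = 0, matching the direction of the theorem stated above.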