Complexity Amplification in Fixed-Volume Neural Systems
Abstract
Complex systems need not expand their physical footprint in order to grow in complexity. Neural systems, particularly the human brain, demonstrate that large-scale increases in functional richness, representational capacity, and dynamical repertoire can occur under strict volumetric limits. This manuscript develops a unified theoretical and computational account of complexity amplification in fixed-volume systems, drawing on multiscale network theory, integrated-information frameworks, compressibility analyses, and metastable coordination dynamics. The model shows, mathematically and computationally, how informational capacity increases through differentiation, integration, and dynamic coordination, all without violating the Monro–Kellie doctrine or energy constraints. A full Monte Carlo simulation suite verifies monotonic growth of complexity under realistic developmental dynamics. The principles generalize across computation, ecosystems, physics, and engineered systems.
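To make the abstract's central claim concrete, the sketch below illustrates one minimal way a Monte Carlo experiment of this kind could look: a linear-Gaussian network of a fixed number of nodes (the "fixed volume"), whose connection density and coupling strength grow across developmental stages, with Gaussian multi-information used as a stand-in for the manuscript's complexity measure. This is not the authors' simulation suite; the model class, the complexity proxy, and all parameters (N, STAGES, the density and strength schedules) are assumptions introduced purely for illustration.

import numpy as np

rng = np.random.default_rng(0)

N = 32        # fixed number of nodes ("fixed volume")
T = 5000      # Monte Carlo samples per developmental stage
STAGES = 10   # developmental steps

def stable_coupling(density, strength, rng):
    """Random coupling matrix rescaled so the linear dynamics stay stable."""
    mask = rng.random((N, N)) < density
    W = rng.normal(0.0, 1.0, (N, N)) * mask
    np.fill_diagonal(W, 0.0)
    radius = max(np.abs(np.linalg.eigvals(W)).max(), 1e-12)
    return strength * W / radius   # spectral radius set to `strength` < 1

def multi_information(X):
    """Gaussian multi-information: sum of marginal entropies minus joint entropy (nats)."""
    cov = np.cov(X, rowvar=False) + 1e-9 * np.eye(N)
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (np.sum(np.log(np.diag(cov))) - logdet)

for stage in range(STAGES):
    # "Development": connection density and coupling strength grow,
    # while the node count (the volume) stays fixed.
    density = 0.05 + 0.04 * stage
    strength = 0.30 + 0.06 * stage
    W = stable_coupling(density, strength, rng)

    # Monte Carlo rollout of x_{t+1} = W x_t + noise.
    X = np.zeros((T, N))
    x = rng.normal(size=N)
    for t in range(T):
        x = W @ x + rng.normal(scale=0.5, size=N)
        X[t] = x

    print(f"stage {stage:2d}  density={density:.2f}  "
          f"integration={multi_information(X):.3f} nats")

Under these assumptions, the multi-information of the network state typically rises from stage to stage even though the number of nodes never changes, which is the qualitative pattern the abstract attributes to its full simulation suite.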