Physical Network Constraints Define the Lognormal Architecture of the Brain’s Connectome
Abstract
The brain has long been conceptualized as a network of neurons connected by synapses. However, attempts to describe the connectome using established network science models have yielded conflicting outcomes, leaving the architecture of neural networks unresolved. Here, by performing a comparative analysis of eight experimentally mapped connectomes, we find that their degree distributions cannot be captured by the well-established random or scale-free models. Instead, the node degrees and strengths are well approximated by lognormal distributions, although these lack a mechanistic explanation in the context of the brain. By acknowledging the physical network nature of the brain, we show that neuron size is governed by a multiplicative process, which allows us to analytically derive the lognormal nature of the neuron length distribution. Our framework not only predicts the degree and strength distributions across each of the eight connectomes, but also yields a series of novel and empirically falsifiable relationships between different neuron characteristics. The resulting multiplicative network represents a novel architecture for network science, whose distinctive quantitative features bridge critical gaps between neural structure and function, with implications for brain dynamics, robustness, and synchronization.
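The central derivation summarized above rests on a standard statistical fact: if a quantity grows by independent random multiplicative factors, its logarithm is a sum of independent terms and is therefore approximately normal by the central limit theorem, making the quantity itself lognormal. The following sketch illustrates this mechanism with a toy simulation; it is not the paper's model, and the number of growth steps and the factor distribution are arbitrary assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy multiplicative growth: each "neuron" starts at unit length and is
# multiplied by an independent random factor at each of T growth steps.
# (Hypothetical parameters, for illustration only.)
n_neurons, T = 50_000, 100
factors = rng.uniform(0.9, 1.2, size=(n_neurons, T))
lengths = factors.prod(axis=1)

# If the multiplicative mechanism holds, log-lengths are a sum of i.i.d.
# terms, hence approximately Gaussian: their standardized skewness
# should be close to zero.
log_lengths = np.log(lengths)
z = (log_lengths - log_lengths.mean()) / log_lengths.std()
skew = (z**3).mean()
print(f"skewness of log-lengths: {skew:.3f}")
```

Running the simulation, the skewness of the log-lengths is close to zero, consistent with a lognormal length distribution emerging from purely multiplicative dynamics.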