AFRACT: Autocorrelation-Aware Fractal Dimension for Complex Networks


Abstract

Estimating the fractal dimension Df of a complex network is fundamental to understanding its self-similar structure, yet existing methods based on box-covering are sensitive to hub concentration and provide no way to incorporate node-property information. We introduce AFRACT (Autocorrelation-aware Fractal dimension for Complex neTworks), a ball-mass scaling algorithm that addresses both limitations. The core idea is simple: instead of counting nodes within a growing ball, AFRACT weights each node by a function of its property value and its spatial autocorrelation with the centre, capturing how local order decays with topological distance. We establish a complete axiomatic framework for the weighting function, prove that property-weighted and unweighted estimates converge to the same Df asymptotically (power-law preservation theorem), and show that the weight matrix is circulant on vertex-transitive graphs, enabling an exact FFT-based computation that achieves a 471× speedup over the direct method. On four deterministic networks with analytically known Df (Sierpiński gasket and carpet, 2D and 3D lattices, N up to 3375), AFRACT achieves R² ≥ 0.998. We derive a universal finite-size correction law Df ≈ D̂f + cG/N^(1/Df), confirmed across three graph classes with R² > 0.99. Against Compact Box Burning on five diverse networks, AFRACT achieves higher goodness of fit, 4.7× better noise robustness, and avoids the hub-induced overestimation that causes CBB to return D̂f = 3.98 on Barabási–Albert networks.
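To make the ball-mass scaling idea concrete, the sketch below estimates Df on a 2D lattice by counting the mass M(r) of a ball of growing topological radius r and fitting the slope of log M(r) against log r. This is only the unweighted baseline: AFRACT's property- and autocorrelation-based node weights are defined in the full article and omitted here, and all function names are illustrative, not from the AFRACT implementation.

```python
import math
from collections import deque

def grid_graph(n):
    """Adjacency list for an n x n square lattice (illustrative test graph)."""
    adj = {(i, j): [] for i in range(n) for j in range(n)}
    for i in range(n):
        for j in range(n):
            for di, dj in ((1, 0), (0, 1)):
                ni, nj = i + di, j + dj
                if ni < n and nj < n:
                    adj[(i, j)].append((ni, nj))
                    adj[(ni, nj)].append((i, j))
    return adj

def bfs_distances(adj, src):
    """Topological (shortest-path) distance from src to every node."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def estimate_df(adj, centre, r_max):
    """Unweighted ball-mass estimate: slope of log M(r) vs log r."""
    dist = bfs_distances(adj, centre)
    radii = range(1, r_max + 1)
    # M(r) = number of nodes within distance r of the centre;
    # AFRACT would instead sum per-node weights here.
    xs = [math.log(r) for r in radii]
    ys = [math.log(sum(1 for d in dist.values() if d <= r)) for r in radii]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    return (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
            / sum((x - xbar) ** 2 for x in xs))

adj = grid_graph(41)
df = estimate_df(adj, (20, 20), r_max=15)
print(f"estimated Df = {df:.2f}")
```

On a finite lattice the fitted slope sits somewhat below the true Df = 2, which is exactly the finite-size bias the abstract's correction law Df ≈ D̂f + cG/N^(1/Df) is meant to remove.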
