A Finite-Size Scaling Theory for Fractal Dimension Estimators in Complex Networks: Inconsistency, Decomposition, and Adaptive Correction
Abstract
Ball-mass scaling algorithms for fractal dimension estimation in complex networks, including the recently proposed AFRACT, systematically underestimate the true fractal dimension D_f on finite graphs. This paper provides the first rigorous analysis of this finite-size bias. We prove three results. Theorem 1 (Bias decomposition): the total bias decomposes into a boundary term B_∂(k), arising from nodes whose ε-balls are clipped by the graph boundary, and a curvature term B_κ, arising from sub-leading terms in the mass function. For d-dimensional lattices of linear scale k = N^{1/D_f}, B_∂(k) = [2d Λ(α)/V_d] · k^{-1}, where Λ(α) = Cov_{[1,αk]}(log ε, ε) / Var_{[1,αk]}(log ε) and V_d is the volume of the unit ℓ1 ball. Theorem 2 (Inconsistency): when the regression range is ε_max = α · diam(G) with α ∈ (0, 1) fixed, D̂_f does not converge to D_f as N → ∞; the asymptotic bias B_∞(α) = lim_{k→∞}(D_f − D̂_f(k)) > 0. Corollary (Adaptive consistency): choosing ε_max = diam(G)/log diam(G) yields a consistent estimator with convergence rate O(log N / N^{1/D_f}). We validate all three results numerically on 2D and 3D lattices and Sierpiński gaskets across five orders of magnitude in N. The adaptive range rule reduces the mean absolute error in D_f by 37–58% compared to the fixed-range rule.
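To make the estimator concrete, the following is a minimal sketch of ball-mass scaling on a 2D lattice with the adaptive range rule ε_max = diam(G)/log diam(G) from the Corollary. All function names here are illustrative, and this is not the AFRACT implementation: mass is measured from the central node only (so no boundary clipping occurs), and the fit is an unweighted least-squares regression of log M(ε) on log ε.

```python
# Illustrative sketch: ball-mass fractal dimension estimation on an n x n grid
# graph, with the adaptive regression range eps_max = diam(G)/log(diam(G)).
# Names (ball_mass, estimate_df) are hypothetical, not from the paper.
from collections import deque
from math import log

def grid_neighbors(x, y, n):
    """4-neighborhood of (x, y) inside an n x n grid."""
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= x + dx < n and 0 <= y + dy < n:
            yield (x + dx, y + dy)

def ball_mass(source, n, radius):
    """Number of grid nodes within graph distance `radius` of `source` (BFS)."""
    seen = {source}
    frontier = [source]
    for _ in range(radius):
        nxt = []
        for (x, y) in frontier:
            for nb in grid_neighbors(x, y, n):
                if nb not in seen:
                    seen.add(nb)
                    nxt.append(nb)
        frontier = nxt
    return len(seen)

def estimate_df(n):
    """Slope of log M(eps) vs log eps for the central node of an n x n grid."""
    diam = 2 * (n - 1)                       # l1 diameter of the grid graph
    eps_max = max(2, int(diam / log(diam)))  # adaptive range rule (Corollary)
    center = (n // 2, n // 2)                # central node: ball never clipped
    xs, ys = [], []
    for eps in range(1, eps_max + 1):
        xs.append(log(eps))
        ys.append(log(ball_mass(center, n, eps)))
    # unweighted least-squares slope
    k = len(xs)
    mx, my = sum(xs) / k, sum(ys) / k
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

On a 101 x 101 grid this returns an estimate somewhat below the true D_f = 2: with the central node the boundary term B_∂ is absent, so the remaining gap is the curvature term B_κ from the sub-leading terms of the ℓ1 ball mass M(ε) = 2ε² + 2ε + 1, in line with the decomposition in Theorem 1.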