Lanczos Iteration and the Spectral Measure: Gram Conditioning and Intrinsic Information Dimension
Abstract
Lanczos iteration is fundamentally a computation on the spectral measure $d\mu$ induced by the matrix $A$ and starting vector $v_1$. The stability of finite-precision Lanczos is governed by the conditioning of the Gram matrix of Krylov basis vectors, which encodes inner products of polynomials against this measure. This paper establishes that orthogonality loss, ghost eigenvalues, and numerical breakdown are consequences of Gram ill-conditioning rather than causes. Three results are proved. First, the monomial basis implicit in standard Lanczos produces Gram matrices whose condition number grows as $\kappa(A)^{2k-2}$, where $k$ is the iteration count, causing numerical breakdown before intrinsic spectral saturation. Second, Chebyshev and Newton--Leja polynomial bases maintain bounded Gram conditioning until an intrinsic limit is reached. Third, this limit, called the intrinsic information dimension $k_{\mathrm{info}}$, depends on the spectral measure of $(A, v_1)$ and is independent of matrix dimension $n$. The framework complements Paige's backward stability analysis with forward conditioning bounds, yields new eigenvector localization results, and explains ghost multiplicity as an information-theoretic effect. A block Lanczos algorithm with Newton--Leja basis achieves full eigenvalue recovery where standard Lanczos produces ghosts. Experiments on matrices of dimension $50$ to $500$ confirm that $k_{\mathrm{info}} \approx 31$ is constant when the spectral interval is $[1, 100]$.

AMS subject classifications: 65F15, 65G50, 15A12
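The abstract's first result, that the monomial Krylov basis yields Gram conditioning growing like $\kappa(A)^{2k-2}$ while a Chebyshev basis stays tame, can be illustrated numerically. The sketch below is not the paper's experiment: the matrix size, iteration count, and starting vector are illustrative assumptions, with only the spectral interval $[1, 100]$ taken from the abstract.

```python
import numpy as np

# Illustrative sketch (not the paper's experimental setup): compare the
# conditioning of the Gram matrix for the monomial Krylov basis
# {v, Av, A^2 v, ...} against a Chebyshev-polynomial basis on the same
# spectral interval [1, 100].
rng = np.random.default_rng(0)
n, k = 200, 12
eigs = np.linspace(1.0, 100.0, n)   # spectrum on [1, 100]
A = np.diag(eigs)                   # diagonal test matrix
v = rng.standard_normal(n)
v /= np.linalg.norm(v)

# Monomial basis: columns v, Av, A^2 v, ..., A^{k-1} v.
cols = [v]
for _ in range(k - 1):
    cols.append(A @ cols[-1])
M = np.column_stack(cols)

# Chebyshev basis: T_j(B) v via the three-term recurrence
# T_{j+1} = 2 B T_j - T_{j-1}, where B maps [1, 100] onto [-1, 1].
a, b = 1.0, 100.0
B = (2.0 * A - (a + b) * np.eye(n)) / (b - a)
cheb = [v, B @ v]
for _ in range(k - 2):
    cheb.append(2.0 * (B @ cheb[-1]) - cheb[-2])
C = np.column_stack(cheb)

cond_mono = np.linalg.cond(M.T @ M)   # blows up with k
cond_cheb = np.linalg.cond(C.T @ C)   # stays moderate
print(f"monomial Gram cond:  {cond_mono:.2e}")
print(f"Chebyshev Gram cond: {cond_cheb:.2e}")
```

Even at $k = 12$ the monomial Gram matrix is numerically singular in double precision, while the Chebyshev Gram matrix remains far from breakdown, consistent with the abstract's claim that the intrinsic limit, not the basis, should determine when iteration saturates.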