Miller-Stable s-Step Conjugate Gradient and Conjugate Residual Methods
Abstract
Communication-avoiding s-step Krylov methods generate s basis vectors per outer iteration to reduce global synchronization on parallel architectures. The projected Gram system arising at each outer iteration must be solved accurately, yet direct factorization becomes numerically unstable when the Gram matrix is ill-conditioned. This paper establishes that the direction of Gauss-Seidel iteration for the Gram system affects numerical stability through the Miller-Gautschi theory of three-term recurrence relations. Forward Gauss-Seidel selects the minimal solution of the underlying recurrence, while Backward Gauss-Seidel can amplify the dominant solution, leading to error growth. Numerical experiments demonstrate that Forward Gauss-Seidel requires two to three times fewer outer iterations than Backward Gauss-Seidel for block sizes s ≥ 15. The paper also establishes that the Conjugate Residual (CR) method is more robust than the Conjugate Gradient (CG) method under inexact Gram solves, because CR minimizes a local residual-norm objective at each step while CG enforces a global A-conjugacy constraint that accumulates errors across iterations. The combination of Forward Gauss-Seidel with s-step CR provides a communication-avoiding method with reliability approaching that of standard CG while reducing synchronization by a factor of s. Numerical experiments on Poisson problems and matrices from the SuiteSparse collection confirm the theoretical predictions and demonstrate practical effectiveness.
Mathematics Subject Classification (2020): 65F10, 65F15, 65G50, 65Y05
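To make the kernel concrete, the following minimal NumPy sketch applies forward and backward Gauss-Seidel sweeps to a toy Gram system G = VᵀV built from a scaled monomial Krylov basis, the kind of ill-conditioned system the abstract describes. This is not the paper's implementation: the helper gauss_seidel, the test matrix A, the block size s, and the sweep count are all illustrative choices made here for demonstration.

```python
import numpy as np

def gauss_seidel(G, b, sweeps=3, forward=True):
    """Gauss-Seidel sweeps for the Gram system G y = b.

    forward=True updates components in increasing index order
    (Forward Gauss-Seidel); forward=False reverses the order
    (Backward Gauss-Seidel).
    """
    n = len(b)
    y = np.zeros(n)
    order = range(n) if forward else range(n - 1, -1, -1)
    for _ in range(sweeps):
        for i in order:
            # b[i] minus the off-diagonal part of row i applied to the
            # current (partially updated) iterate y.
            r = b[i] - G[i, :] @ y + G[i, i] * y[i]
            y[i] = r / G[i, i]
    return y

# Hypothetical test setup: a scaled monomial s-step basis
# V = [v, Av, ..., A^{s-1} v] whose Gram matrix is ill-conditioned.
rng = np.random.default_rng(0)
s, n = 8, 200
A = np.diag(np.linspace(1.0, 100.0, n))
V = np.empty((n, s))
V[:, 0] = rng.standard_normal(n)
V[:, 0] /= np.linalg.norm(V[:, 0])
for k in range(1, s):
    w = A @ V[:, k - 1]
    V[:, k] = w / np.linalg.norm(w)   # normalize to avoid overflow
G = V.T @ V                            # ill-conditioned Gram matrix
b = V.T @ rng.standard_normal(n)       # projected right-hand side

y_fwd = gauss_seidel(G, b, sweeps=5, forward=True)
y_bwd = gauss_seidel(G, b, sweeps=5, forward=False)
y_ref = np.linalg.solve(G, b)
print("cond(G)            :", np.linalg.cond(G))
print("forward GS error   :", np.linalg.norm(y_fwd - y_ref))
print("backward GS error  :", np.linalg.norm(y_bwd - y_ref))
```

Comparing both errors against a direct solve gives a quick way to probe the direction dependence on other test matrices; the abstract's Miller-Gautschi argument is what connects the forward sweep order to the minimal solution of the underlying three-term recurrence.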