Conjugate residual acceleration of proximal gradient methods with optimal trial vectors

Abstract

We consider composite convex optimization problems consisting of a smooth term with a Lipschitz continuous gradient and a nonsmooth but proximable term, which are commonly solved using proximal gradient (PG) methods. We present a novel accelerated PG method that integrates conjugate residuals with optimal trial vectors (CROP), resulting in the CROP-PG algorithm. By selecting informative trial directions and employing a nonmonotone acceptance strategy, CROP-PG significantly enhances convergence while retaining the simplicity and efficiency of standard proximal updates. The method is designed to be robust across diverse problem settings, including large-scale and poorly conditioned optimization tasks. Extensive numerical experiments on benchmark problems, including LASSO, nonnegative regression, and $\ell_1$-regularized logistic regression, demonstrate that CROP-PG consistently outperforms classical PG methods, FISTA, and Anderson-accelerated schemes in terms of iteration count and computational time. The basic convergence properties of the proposed method are also established, and the high acceptance rates of the accelerated steps enable CROP-PG to exploit curvature information effectively without compromising stability. These results establish the CROP framework as a general, reliable, and efficient approach to accelerating PG-type algorithms in large-scale optimization problems.

Mathematics Subject Classification (2000): 65K05 · 90C25 · 90C06
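For readers unfamiliar with the baseline the abstract builds on, the following is a minimal sketch of an unaccelerated proximal gradient (ISTA-style) iteration applied to the LASSO problem $\min_x \tfrac{1}{2}\|Ax-b\|^2 + \lambda\|x\|_1$. It is illustrative only and does not implement the CROP acceleration itself; all function names and the tiny demo data are assumptions, not from the paper.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, n_iter=500):
    """Plain proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    with fixed step size 1/L, where L is the Lipschitz constant of the
    smooth gradient (largest eigenvalue of A^T A)."""
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                    # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)   # proximal (shrinkage) step
    return x

# Tiny demo: recover a sparse vector from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
x_true = np.zeros(10)
x_true[[1, 4]] = [2.0, -3.0]
b = A @ x_true
x_hat = proximal_gradient_lasso(A, b, lam=0.1)
```

Accelerated variants such as FISTA, Anderson acceleration, and the CROP-PG method proposed here replace the plain iterate update with an extrapolated or subspace-combined step while keeping the same proximal machinery.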
