Robust Hybrid Conjugate Gradient Algorithms via Projection for Large-Scale Optimization and Compressed Sensing

Abstract

This paper introduces two hybrid conjugate gradient (CG) methods that combine the strong convergence theory of the Dai–Yuan (DY) method with the computational efficiency of the Rivaie–Mustafa–Ismail–Leong (RMIL) method. Both methods are built from a convex combination of the two conjugate parameters and differ in how the hybridization parameter is computed: one choice enforces the conjugacy condition independently of the line search, while the other is derived from quasi-Newton (QN) methods and the standard secant condition. The methods target large-scale unconstrained optimization problems arising in science and engineering. In both, the sufficient descent property is guaranteed by projecting the search direction onto the orthogonal complement of the gradient, independently of the line search and of any convexity assumption on the objective. We establish global convergence for general objective functions under standard assumptions. Numerical experiments on CUTEr test problems and a compressed sensing application show that the proposed algorithms are more robust and computationally efficient than existing state-of-the-art CG methods.

MSC Classification: 90C06; 49M37
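To make the construction concrete, the following is a minimal sketch of one iteration's search direction under common assumptions: the standard DY and RMIL conjugate parameters, a convex combination with a fixed hybridization weight `theta` (the paper instead computes this weight from the conjugacy or secant condition, so the fixed value here is purely illustrative), and the projection of the previous direction onto the orthogonal complement of the current gradient, which forces the sufficient descent identity $d_{k+1}^\top g_{k+1} = -\|g_{k+1}\|^2$ regardless of the line search. The function name and signature are hypothetical, not taken from the paper.

```python
import numpy as np

def hybrid_projected_direction(g_new, g_old, d_old, theta=0.5):
    """Illustrative hybrid DY/RMIL CG direction with projection.

    beta_DY   = ||g_new||^2 / (d_old^T y),   y = g_new - g_old
    beta_RMIL = (g_new^T y) / ||d_old||^2
    beta      = (1 - theta) * beta_DY + theta * beta_RMIL   (convex combination)

    NOTE: theta is fixed here for illustration; the paper derives it from the
    conjugacy condition or the secant condition instead.
    """
    y = g_new - g_old
    beta_dy = (g_new @ g_new) / (d_old @ y)
    beta_rmil = (g_new @ y) / (d_old @ d_old)
    beta = (1.0 - theta) * beta_dy + theta * beta_rmil

    # Project d_old onto the orthogonal complement of g_new; since the
    # projected term is orthogonal to g_new, the new direction satisfies
    # d_new^T g_new = -||g_new||^2 (sufficient descent) by construction.
    gnorm2 = g_new @ g_new
    d_proj = d_old - ((g_new @ d_old) / gnorm2) * g_new
    return -g_new + beta * d_proj

if __name__ == "__main__":
    # One exact-line-search step on a toy quadratic f(x) = 0.5 x^T A x,
    # then verify the sufficient descent identity for the hybrid direction.
    A = np.diag([1.0, 2.0, 3.0])
    x = np.array([1.0, 1.0, 1.0])
    g_old = A @ x
    d_old = -g_old
    alpha = -(g_old @ d_old) / (d_old @ A @ d_old)  # exact step for quadratics
    x = x + alpha * d_old
    g_new = A @ x
    d_new = hybrid_projected_direction(g_new, g_old, d_old)
    print("d^T g = ", d_new @ g_new, " -||g||^2 = ", -(g_new @ g_new))
```

The projection makes the descent guarantee purely geometric: it holds for any `beta` and any objective, which is why neither the line search nor convexity enters the argument.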
