Improved Conjugate Gradient Methods Based on Scaling Techniques for Large-Scale Unconstrained Optimization


Abstract

The well-known class of Conjugate Gradient (CG) methods is an important computational tool for solving large-scale unconstrained optimization problems. We study the recent class of scaled CG techniques and propose several choices of the scaling parameter that guarantee the sufficient descent condition. A quasi-Newton feature is also incorporated into this class of methods, in a sense made precise in the paper. We then establish global convergence for several choices of the parameter under various conditions. The behavior of the proposed methods is tested on a set of standard test problems. The reported numerical results show that the proposed scaling techniques substantially improve the performance of several CG methods.
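To make the idea concrete, the following is a minimal sketch of a scaled CG iteration, d_k = -θ_k g_k + β_k d_{k-1}. The specific choices here (a Barzilai–Borwein-style spectral scaling for θ_k, a PR+ formula for β_k, an Armijo backtracking line search, and a steepest-descent restart when descent is lost) are illustrative assumptions, not the parameter choices proposed in the paper:

```python
import numpy as np

def scaled_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Illustrative scaled CG: d_k = -theta_k * g_k + beta_k * d_{k-1}.

    theta_k is a Barzilai-Borwein-style spectral scaling and beta_k a
    PR+ formula -- hypothetical choices, not the paper's parameters.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s.dot(y)
        theta = s.dot(s) / sy if sy > 1e-12 else 1.0  # spectral scaling
        beta = max(g_new.dot(y) / g.dot(g), 0.0)      # PR+ parameter
        d = -theta * g_new + beta * d
        # Restart with scaled steepest descent if descent is lost
        if g_new.dot(d) >= 0:
            d = -theta * g_new
        x, g = x_new, g_new
    return x

# Usage: minimize a strictly convex quadratic f(x) = 0.5 x'Ax - b'x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_min = scaled_cg(lambda x: 0.5 * x.dot(A.dot(x)) - b.dot(x),
                  lambda x: A.dot(x) - b,
                  np.zeros(2))
```

The restart step guards the sufficient descent condition in this sketch; the paper's contribution is precisely to choose the scaling parameter so that sufficient descent holds without such ad hoc safeguards.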