Non-asymptotic superlinear convergence of Nesterov accelerated BFGS

Abstract

This paper studies the convergence of a Nesterov accelerated variant of the Broyden-Fletcher-Goldfarb-Shanno quasi-Newton method (NA-BFGS) in the setting where the objective function is strongly convex, its gradient is Lipschitz continuous, and its Hessian is Lipschitz continuous at the optimal point. We show that, similarly to the classical BFGS method, the Nesterov accelerated BFGS method achieves a non-asymptotic superlinear convergence rate within a local neighbourhood of the optimal point. The work provides a theoretical explanation of the superlinear convergence of NA-BFGS and an explicit comparison with the classical BFGS method. We further validate the theoretical results with numerical experiments.
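The abstract does not spell out the iteration, so the following is only an illustrative sketch of how Nesterov-style extrapolation can be combined with a BFGS inverse-Hessian update; the momentum coefficient, step size, and update order used in the paper may differ. The function `na_bfgs_sketch`, the fixed momentum parameter, and the quadratic test problem are all assumptions introduced here for illustration.

```python
import numpy as np

def na_bfgs_sketch(grad, x0, n_iter=100, momentum=0.9):
    """Illustrative Nesterov-accelerated BFGS iteration (not the paper's
    exact method): extrapolate with momentum, take a quasi-Newton step at
    the extrapolated point, then apply the standard BFGS inverse-Hessian
    update to the resulting curvature pair."""
    n = len(x0)
    H = np.eye(n)               # inverse-Hessian approximation
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(n_iter):
        y = x + momentum * (x - x_prev)   # Nesterov extrapolation point
        g = grad(y)
        x_new = y - H @ g                 # quasi-Newton step at y (unit step)
        s = x_new - y                     # displacement
        u = grad(x_new) - g               # gradient change
        su = s @ u
        if su > 1e-12:                    # curvature-condition safeguard
            rho = 1.0 / su
            I = np.eye(n)
            # Standard BFGS update of the inverse-Hessian approximation
            H = (I - rho * np.outer(s, u)) @ H @ (I - rho * np.outer(u, s)) \
                + rho * np.outer(s, s)
        x_prev, x = x, x_new
    return x

# Strongly convex quadratic f(x) = 0.5 x^T A x - b^T x with known minimizer
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
x_star = np.linalg.solve(A, b)
x = na_bfgs_sketch(grad, np.zeros(2))
```

On this quadratic the iterates approach the minimizer `x_star`; since the gradient is evaluated at the extrapolated point, the quasi-Newton step corrects the momentum overshoot once the inverse-Hessian approximation is accurate.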
