Preconditioned Nonlinear Conjugate Gradient methods based on a modified secant equation

Abstract

This paper presents a twofold contribution to the Nonlinear Conjugate Gradient (NCG) method for large scale unconstrained optimization. First, we carry out a theoretical analysis in which preconditioning is embedded in a strong convergence framework for an NCG method from the literature. We define mild conditions on the preconditioners that preserve NCG convergence. Second, we detail novel matrix–free preconditioners for NCG. Our proposals are based on quasi–Newton updates and satisfy either the secant equation or a secant–like condition at some of the previous iterates. We show that, in a precise sense, the proposed preconditioners also approximate the inverse of the Hessian matrix. In particular, the structure of each preconditioner depends on the low–rank updates used, along with different choices of specific parameters. The low–rank updates are obtained as a by–product of NCG iterations. The results of extensive numerical experiments on large scale CUTEst problems are reported, showing that our preconditioners can considerably improve the performance of NCG methods.
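The quasi–Newton preconditioners described above are built from low–rank updates that satisfy a secant (or secant–like) condition, using step and gradient-difference pairs produced by the NCG iterations. As a minimal illustration of this idea (a generic BFGS-style sketch, not the authors' specific construction), the following code builds an inverse-Hessian approximation from a single pair s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, verifies that it satisfies the inverse secant equation H_{k+1} y_k = s_k, and shows how the same update can be applied to a vector matrix-free, i.e. without ever forming an n×n matrix:

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """BFGS-style update of an inverse-Hessian approximation H from a
    step pair s = x_{k+1} - x_k, y = grad_{k+1} - grad_k.
    The result satisfies the inverse secant equation: H_new @ y == s."""
    rho = 1.0 / (y @ s)               # requires curvature condition y^T s > 0
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

def apply_preconditioner_matrix_free(g, s, y):
    """Apply the same update (with H = I, a memoryless variant) to a
    vector g using only inner products and axpy operations, the way a
    matrix-free preconditioner would inside an NCG iteration."""
    rho = 1.0 / (y @ s)
    v = g - rho * (s @ g) * y         # v = (I - rho y s^T) g
    return v - rho * (y @ v) * s + rho * (s @ g) * s

rng = np.random.default_rng(0)
n = 6
s = rng.standard_normal(n)
y = s + 0.1 * rng.standard_normal(n)  # perturbed so that y^T s > 0
H_new = bfgs_inverse_update(np.eye(n), s, y)

# The inverse secant equation holds exactly (up to rounding):
assert np.allclose(H_new @ y, s)

# The matrix-free application agrees with the explicit matrix:
g = rng.standard_normal(n)
assert np.allclose(apply_preconditioner_matrix_free(g, s, y), H_new @ g)
```

The paper's preconditioners differ in which pairs are retained and how the update parameters are chosen, but the matrix-free pattern above (storing a few vectors and applying the preconditioner through inner products) is what keeps the cost per iteration linear in the problem dimension.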

Keywords: Nonlinear Conjugate Gradient method, Large scale optimization, Secant equation, Low–rank updates

Article history: Received 1 February 2017, Revised 19 July 2017, Accepted 11 August 2017, Available online 5 September 2017, Version of Record 18 October 2017.

DOI: https://doi.org/10.1016/j.amc.2017.08.029