Acceleration of conjugate gradient algorithms for unconstrained optimization

Authors:

Highlights:

Abstract:

Conjugate gradient methods are important for large-scale unconstrained optimization. This paper proposes an acceleration of these methods based on a modification of the steplength. The idea is to modify, in a multiplicative manner, the steplength α_k computed by the Wolfe line search conditions by means of a positive parameter η_k, in such a way as to improve the behavior of the classical conjugate gradient algorithms. It is shown that for uniformly convex functions the convergence of the accelerated algorithm is still linear, but the reduction in function values is significantly improved. Numerical comparisons with some conjugate gradient algorithms on a set of 750 unconstrained optimization problems, some of them from the CUTE library, show that the accelerated computational scheme outperforms the corresponding conjugate gradient algorithms.
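To make the acceleration idea concrete, the following is a minimal Python sketch of a conjugate gradient iteration in which the Wolfe steplength α_k is rescaled by a multiplicative factor η_k > 0 before the step is taken. The abstract does not give the paper's definition of η_k; here η_k is derived from a one-dimensional quadratic model of f along the search direction, and the Polak-Ribière update, the fallback steplength, and the name `accelerated_cg` are illustrative assumptions, not the paper's specification.

```python
import numpy as np
from scipy.optimize import line_search

def accelerated_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Polak-Ribiere CG with a multiplicative steplength acceleration:
    x_{k+1} = x_k + eta_k * alpha_k * d_k,  eta_k > 0.
    eta_k below comes from a 1-D quadratic model of f along d_k at the
    Wolfe trial point -- an illustrative choice, not the paper's formula."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Base steplength alpha_k from a (strong) Wolfe line search.
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:
            alpha = 1e-4  # crude fallback if the line search fails
        z = x + alpha * d           # trial point produced by the line search
        gz = grad(z)
        # 1-D quadratic model along d:
        #   a ~ alpha * f'(x; d) (negative for a descent direction),
        #   b ~ alpha^2 * curvature of f along d.
        a = alpha * g.dot(d)
        b = alpha * (gz - g).dot(d)
        if b > 0.0:
            eta = -a / b            # minimizer of the quadratic model, eta > 0
            x = x + eta * alpha * d # accelerated step
            g_new = grad(x)         # costs one extra gradient evaluation
        else:
            x, g_new = z, gz        # no acceleration; keep the Wolfe point
        # Polak-Ribiere+ direction update; max(0, .) acts as a restart.
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        g = g_new
    return x
```

For instance, running the sketch on SciPy's Rosenbrock test function (reusing the imports above):

```python
from scipy.optimize import rosen, rosen_der
x_star = accelerated_cg(rosen, rosen_der, np.array([-1.2, 1.0]))
```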

Keywords: Acceleration methods, Conjugate gradient, Wolfe line search, Line search gradient methods, Unconstrained optimization

Review process: Available online 19 March 2009.

Paper URL: https://doi.org/10.1016/j.amc.2009.03.020