New accelerated conjugate gradient algorithms as a modification of Dai–Yuan’s computational scheme for unconstrained optimization
Abstract
New accelerated nonlinear conjugate gradient algorithms for unconstrained optimization, obtained mainly as modifications of the Dai–Yuan method, are proposed. Under exact line search, the algorithm reduces to the Dai–Yuan conjugate gradient computational scheme; under inexact line search, it satisfies the sufficient descent condition. Since the step lengths in conjugate gradient algorithms may differ from 1 by two orders of magnitude and tend to vary in a very unpredictable manner, the algorithms are equipped with an acceleration scheme that improves their efficiency. Computational results on a set of 750 unconstrained optimization test problems show that these new conjugate gradient algorithms substantially outperform the Dai–Yuan conjugate gradient algorithm and its hybrid variants, the Hestenes–Stiefel, Polak–Ribière–Polyak, and CONMIN conjugate gradient algorithms, and the limited-memory quasi-Newton algorithm L-BFGS, and compare favorably with CG_DESCENT. Within this numerical study, the accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm ASCALCG proved to be the most robust.
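To make the baseline concrete, below is a minimal sketch of the classical Dai–Yuan conjugate gradient update that the paper modifies, applied to a convex quadratic with an exact line search (the setting in which, per the abstract, the new algorithms reduce to the Dai–Yuan scheme). The function name `dai_yuan_cg` and the quadratic test problem are illustrative; this is not the authors' accelerated algorithm.

```python
import numpy as np

def dai_yuan_cg(A, b, x0, tol=1e-10, max_iter=200):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with the Dai-Yuan conjugate gradient update
        beta_k = ||g_{k+1}||^2 / (d_k^T y_k),  y_k = g_{k+1} - g_k.
    An exact line search is used, which is available in closed form
    for this quadratic model problem."""
    x = x0.astype(float).copy()
    g = A @ x - b            # gradient of f at x
    d = -g                   # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        Ad = A @ d
        alpha = -(g @ d) / (d @ Ad)        # exact minimizer along d
        x = x + alpha * d
        g_new = A @ x - b
        y = g_new - g
        beta = (g_new @ g_new) / (d @ y)   # Dai-Yuan beta
        d = -g_new + beta * d
        g = g_new
    return x
```

On a quadratic with exact line search this recovers the linear conjugate gradient iterates; in the general nonlinear setting, a (Wolfe) line search replaces the closed-form step, and the accelerated variants of the paper additionally rescale the step length.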
Keywords: 49M07, 49M10, 90C06, 65K05, Unconstrained optimization, Conjugate gradient method, Sufficient descent condition, Conjugacy condition, Newton direction, Numerical comparisons
Article history: Received 4 February 2009, Revised 3 May 2010, Available online 10 June 2010.
Article URL: https://doi.org/10.1016/j.cam.2010.05.002