A simple three-term conjugate gradient algorithm for unconstrained optimization
Abstract
A simple three-term conjugate gradient algorithm that satisfies both the descent condition and the conjugacy condition is presented. The algorithm modifies the Hestenes and Stiefel algorithm (Hestenes and Stiefel, 1952) [10], or equivalently that of Hager and Zhang (Hager and Zhang, 2005) [23], so that the search direction is a descent direction and satisfies the conjugacy condition; both properties hold independently of the line search. The algorithm can also be viewed as a modification of the memoryless BFGS quasi-Newton method. The new approximation of the minimum is obtained by a general Wolfe line search, combined with a standard acceleration technique developed by Andrei (2009) [27]. For uniformly convex functions, under standard assumptions, global convergence of the algorithm is proved. Numerical comparisons of the suggested three-term conjugate gradient algorithm against six other three-term conjugate gradient algorithms, on a set of 750 unconstrained optimization problems, show that all these computational schemes have similar performance, the suggested one being slightly faster and more robust. The proposed algorithm substantially outperforms the well-known Hestenes and Stiefel conjugate gradient algorithm, as well as the more elaborate CG_DESCENT algorithm. On five applications from the MINPACK-2 test problem collection (Averick et al., 1992) [25], each with 10^6 variables, the suggested three-term conjugate gradient algorithm is the top performer versus CG_DESCENT.
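To make the idea concrete, the following is a minimal sketch of a generic three-term conjugate gradient iteration of the kind the abstract describes. It uses a well-known Hestenes-Stiefel-type three-term direction (an illustrative choice; the paper's exact update formulas are not given in the abstract), and a backtracking Armijo line search stands in for the paper's Wolfe line search with acceleration. The function name `three_term_cg` and all parameter choices are hypothetical.

```python
import numpy as np

def three_term_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Illustrative three-term CG sketch (not the paper's exact method).

    Direction update (Hestenes-Stiefel-type three-term variant):
        d_{k+1} = -g_{k+1} + beta_k * d_k - theta_k * y_k,
    with beta_k  = g_{k+1}^T y_k / (d_k^T y_k),
         theta_k = g_{k+1}^T d_k / (d_k^T y_k),
    which yields d_{k+1}^T g_{k+1} = -||g_{k+1}||^2, i.e. a descent
    direction independently of the line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (simplified stand-in for the
        # general Wolfe search with acceleration used in the paper).
        alpha, fx, slope = 1.0, f(x), g @ d  # slope < 0: descent direction
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        if abs(denom) > 1e-12:
            beta = (g_new @ y) / denom
            theta = (g_new @ d) / denom
            d = -g_new + beta * d - theta * y
        else:
            d = -g_new  # restart on near-breakdown of the curvature term
        x, g = x_new, g_new
    return x

# Usage on a small strictly convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x_star = three_term_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                       lambda x: A @ x - b,
                       np.zeros(3))
```

Note the key property exploited above: substituting the update into d_{k+1}^T g_{k+1} makes the beta and theta terms cancel exactly, so the descent condition holds for any step length the line search returns.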
Keywords: Unconstrained optimization, Three-term conjugate gradient, Descent condition, Conjugacy condition, Numerical comparisons
Article history: Received 8 March 2012, Revised 28 September 2012, Available online 6 October 2012.
DOI: https://doi.org/10.1016/j.cam.2012.10.002