A note on the global convergence theorem of accelerated adaptive Perry conjugate gradient methods
Authors:
Abstract
In Andrei (2017), a class of efficient conjugate gradient algorithms (ACGSSV) is proposed for solving large-scale unconstrained optimization problems. However, owing to an incorrect inequality and a flawed argument in the analysis of the algorithm's global convergence property, the proof of Theorem 4.2, the global convergence theorem, is incorrect. In this paper, the necessary corrections are made. Under common assumptions, it is shown that Algorithm ACGSSV converges linearly to the unique minimizer.
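For orientation, the following is a minimal sketch of a generic Perry-type (memoryless BFGS-style) conjugate gradient iteration of the kind the abstract refers to. It is an illustrative assumption, not the ACGSSV algorithm itself: the simple backtracking line search stands in for the Wolfe conditions, and the self-scaling and acceleration steps of the actual method are omitted.

```python
import numpy as np

def perry_type_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Illustrative sketch of a Perry-type conjugate gradient iteration.

    Not the ACGSSV algorithm of Andrei (2017); the line search and the
    search-direction formula below are generic simplifications.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking (Armijo) line search, a stand-in for Wolfe conditions.
        alpha, rho, c1 = 1.0, 0.5, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s = x_new - x           # step
        y = g_new - g           # gradient change
        ys = y.dot(s)
        if abs(ys) < 1e-12:     # safeguard: restart with steepest descent
            d = -g_new
        else:
            # Generic Perry / memoryless BFGS-type direction:
            # d = -g_new + (y^T g_new / y^T s) s - (s^T g_new / y^T s) y
            d = -g_new + (y.dot(g_new) / ys) * s - (s.dot(g_new) / ys) * y
        x, g = x_new, g_new
    return x
```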
MSC: 90C30, 65K05, 49M37. Keywords: Unconstrained optimization, Conjugate gradient algorithm, Accelerated scheme, Self-scaling memoryless BFGS update, Convergence analysis
Article history: Received 23 July 2017, Revised 13 October 2017, Accepted 16 October 2017, Available online 3 November 2017, Version of Record 3 November 2017.
DOI: https://doi.org/10.1016/j.cam.2017.10.024