A class of gradient unconstrained minimization algorithms with adaptive stepsize
Authors: M.N. Vrahatis, G.S. Androulakis, J.N. Lambrinos, G.D. Magoulas
Abstract
In this paper, the development, convergence theory, and numerical testing of a class of gradient unconstrained minimization algorithms with adaptive stepsize are presented. The proposed class comprises four algorithms: the first two adapt a common stepsize for all coordinate directions, while the other two maintain an individual adaptive stepsize along each coordinate direction. All the algorithms are computationally efficient and possess interesting convergence properties, utilizing estimates of the Lipschitz constant that are obtained without additional function or gradient evaluations. The algorithms have been implemented and tested on some well-known test cases as well as on real-life artificial neural network applications, and the results have been very satisfactory.
MSC: 65K05; 65K10; 49D37; 65C20; 82C32; 68T05
Keywords: Unconstrained optimization; Steepest descent; Gradient method; Lipschitz constant; Line search strategies; Armijo's method; Globally convergent method; Artificial neural network; Training algorithm
Article history: Received 16 April 1997; Revised 22 January 1999; Available online 24 January 2000.
DOI: https://doi.org/10.1016/S0377-0427(99)00276-9