A new class of nonmonotone conjugate gradient training algorithms

Abstract

In this paper, we propose a new class of conjugate gradient algorithms for training neural networks, based on a modified nonmonotone scheme proposed by Shi and Wang (2011). The nonmonotone strategy enables the training algorithm to overcome the situation where the sequence of iterates runs into the bottom of a curved narrow valley, a common occurrence in the neural network training process. Our proposed class of methods ensures sufficient descent, thereby avoiding the usual inefficient restarts, and is globally convergent under mild conditions. Our experimental results provide evidence that the proposed nonmonotone conjugate gradient training methods are efficient, outperform classical methods, and provide more stable and reliable learning.
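To make the nonmonotone idea concrete, the following is a minimal Python sketch of a nonmonotone Armijo line search (GLL-style, using the maximum of the most recent function values as the reference) inside a Polak-Ribière+ conjugate gradient loop. It illustrates the general technique only and does not reproduce the specific Shi and Wang (2011) scheme used in the paper; all names and parameter values (memory, sigma, beta) are illustrative assumptions.

```python
import numpy as np

def nonmonotone_cg(f, grad, x0, max_iter=500, tol=1e-6, memory=10,
                   sigma=1e-4, beta=0.5):
    """Generic nonmonotone CG sketch: GLL-style reference value with a
    Polak-Ribiere+ direction. Illustrative only, not the paper's method."""
    x = x0.copy()
    g = grad(x)
    d = -g
    f_hist = [f(x)]
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Fall back to steepest descent if d is not a descent direction;
        # this restores the sufficient-descent property.
        if g @ d >= 0.0:
            d = -g
        # Nonmonotone reference: max over the last `memory` function values,
        # so the step may be accepted even if f increases temporarily.
        f_ref = max(f_hist[-memory:])
        t = 1.0
        while f(x + t * d) > f_ref + sigma * t * (g @ d):
            t *= beta  # backtracking
        x_new = x + t * d
        g_new = grad(x_new)
        # Polak-Ribiere+ coefficient, clipped at zero to aid global convergence.
        beta_pr = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta_pr * d
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x

# Example: the Rosenbrock function, a curved narrow valley of the kind
# where strictly monotone line searches tend to take very small steps.
if __name__ == "__main__":
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])
    print(nonmonotone_cg(f, grad, np.array([-1.2, 1.0])))  # approaches [1, 1]
```

On the Rosenbrock example, the nonmonotone reference value permits occasional increases in f, which typically lets larger steps be accepted along the valley floor than a strictly monotone Armijo search would allow.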

Keywords: Artificial neural networks, Conjugate gradient algorithm, Nonmonotone line search, Global convergence

Article history: Received 27 April 2014, Accepted 6 May 2015, Available online 10 June 2015, Version of Record 10 June 2015.

DOI: https://doi.org/10.1016/j.amc.2015.05.053