Deep learning with adaptive learning rate using Laplacian score

Authors:

Highlights:

• Adaptive learning rate has been proposed for Deep Learning in MLP.

• Technique for updating learning rate is free of hyper-parameters.

• Learning rate is a function of a parameter called learning parameter.

• Learning parameter is updated based on error gradient.

• Learning rate is further updated based on the Laplacian score of activation values.
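
The highlights above only outline the method; the exact update rules are in the paper linked below. As a rough illustration of how such a scheme could be wired together, the following minimal NumPy sketch trains a one-hidden-layer MLP with an adaptive learning rate. The specific choices here (the form of the learning-rate function, how the learning parameter reacts to the error gradient, and how the Laplacian score of the hidden activations rescales the learning rate) are illustrative assumptions, not the authors' equations; the `laplacian_score` helper follows the standard He–Cai–Niyogi definition applied to a batch of activation values.

```python
# Hypothetical sketch of an adaptive learning rate driven by a Laplacian score,
# loosely following the highlights above. Every concrete formula below is an
# illustrative assumption, not the paper's actual update rule.
import numpy as np

def laplacian_score(F, t=1.0):
    """Laplacian score of each column (feature) of F (He et al., 2005).

    F: (n_samples, n_features) matrix, here a batch of hidden activations.
    Smaller scores indicate features that better preserve the batch's
    local structure.
    """
    n = F.shape[0]
    # Heat-kernel similarity graph over the samples (fully connected for simplicity).
    d2 = np.sum((F[:, None, :] - F[None, :, :]) ** 2, axis=2)
    S = np.exp(-d2 / t)
    D = np.diag(S.sum(axis=1))
    L = D - S
    ones = np.ones(n)
    scores = np.empty(F.shape[1])
    for r in range(F.shape[1]):
        f = F[:, r]
        # Centre the feature with respect to the degree matrix.
        f_tilde = f - (f @ D @ ones) / (ones @ D @ ones) * ones
        denom = f_tilde @ D @ f_tilde
        scores[r] = (f_tilde @ L @ f_tilde) / denom if denom > 1e-12 else 0.0
    return scores

rng = np.random.default_rng(0)

# Toy regression data and a one-hidden-layer MLP.
X = rng.normal(size=(64, 4))
y = np.sin(X.sum(axis=1, keepdims=True))
W1, b1 = rng.normal(scale=0.5, size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)

learning_param = 1.0          # "learning parameter" (assumed initial value)
prev_grad_norm = None

for epoch in range(200):
    # Forward pass.
    H = np.tanh(X @ W1 + b1)          # hidden activation values
    y_hat = H @ W2 + b2
    err = y_hat - y
    loss = 0.5 * np.mean(err ** 2)

    # Backward pass (mean-squared-error gradients).
    dW2 = H.T @ err / len(X)
    db2 = err.mean(axis=0)
    dH = err @ W2.T * (1 - H ** 2)
    dW1 = X.T @ dH / len(X)
    db1 = dH.mean(axis=0)

    grad_norm = np.sqrt(sum(np.sum(g ** 2) for g in (dW1, db1, dW2, db2)))

    # (Assumed rule) update the learning parameter from the error gradient:
    # grow it when the gradient norm grows, shrink it when training settles.
    if prev_grad_norm is not None:
        learning_param = max(0.1, learning_param + 0.1 * np.sign(grad_norm - prev_grad_norm))
    prev_grad_norm = grad_norm

    # (Assumed rule) learning rate as a function of the learning parameter.
    eta = 1.0 / (1.0 + learning_param)

    # (Assumed rule) further modulate eta by the mean Laplacian score of the
    # hidden activation values for the current batch.
    eta *= 1.0 / (1.0 + laplacian_score(H).mean())

    # Plain gradient-descent step with the adapted learning rate.
    W1 -= eta * dW1; b1 -= eta * db1
    W2 -= eta * dW2; b2 -= eta * db2

    if epoch % 50 == 0:
        print(f"epoch {epoch:3d}  loss {loss:.4f}  eta {eta:.4f}")
```

In this sketch a small mean Laplacian score of the hidden activations leaves the step size close to 1/(1 + learning parameter), while a large score damps it; the paper may combine these two signals quite differently.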


Keywords: Adaptive learning rate, Deep learning, Gradient descent, Laplacian score

Article history: Received 28 September 2015, Revised 12 May 2016, Accepted 13 May 2016, Available online 16 May 2016, Version of Record 29 June 2016.

DOI: https://doi.org/10.1016/j.eswa.2016.05.022