Enhancing batch normalized convolutional networks using displaced rectifier linear units: A systematic comparative study

Authors:

Highlights:

• Enhanced nonlinearities may improve the performance of expert systems.

• Proposal of the activation function DReLU (see the sketch after this list).

• DReLU presents the best training speed in all cases.

• DReLU improves on ReLU's performance in all scenarios.

• DReLU provides the best test accuracy in almost all experiments.
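Since the highlights name DReLU without defining it, here is a minimal sketch. It assumes the displaced rectifier linear unit is a ReLU whose inflection point is shifted diagonally away from the origin, i.e. DReLU(x) = max(x, -δ) for a small displacement δ; the function name `drelu` and the default `delta=0.05` are illustrative assumptions, not taken from this metadata page.

```python
import numpy as np

def drelu(x, delta=0.05):
    """Displaced ReLU sketch: a ReLU whose inflection point is moved
    diagonally from the origin to (-delta, -delta).

    Assumes DReLU(x) = max(x, -delta); delta=0.05 is an illustrative value.
    """
    return np.maximum(x, -delta)

# Usage: negative inputs are clipped at -delta instead of 0.
x = np.array([-1.0, -0.05, 0.0, 0.5])
print(drelu(x))  # [-0.05 -0.05  0.    0.5 ]
```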

Keywords: DReLU, Activation function, Batch normalization, Comparative study, Convolutional Neural Networks, Deep learning

Article history: Received 13 June 2018; Revised 3 January 2019; Accepted 27 January 2019; Available online 29 January 2019; Version of Record 1 February 2019.

DOI: https://doi.org/10.1016/j.eswa.2019.01.066