RSigELU: A nonlinear activation function for deep neural networks

Authors:

Highlights:

• Novel RSigELU activation functions, namely RSigELUS and RSigELUD, which combine the ReLU, sigmoid, and ELU activation functions, were proposed.

• The proposed RSigELU activation functions can operate in the positive, negative, and linear activation regions, and overcome both the vanishing gradient problem and the negative region problem (see the illustrative sketch after this list).

• Performance evaluation of the proposed activation functions was carried out using a VGG architecture on the MNIST, Fashion MNIST, IMDb Movie, and CIFAR-10 benchmark datasets.
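The highlights describe RSigELU as combining ReLU, sigmoid, and ELU behaviour across positive, linear, and negative activation regions. The NumPy code below is a minimal, hypothetical sketch of such a piecewise activation; the function name rsigelu_sketch, the parameters alpha and beta, and the exact piecewise form are illustrative assumptions and are not the paper's definition.

    import numpy as np

    def rsigelu_sketch(x, alpha=0.5, beta=0.5):
        """Illustrative piecewise activation in the spirit of RSigELU.

        Hypothetical form (assumed for illustration, not the paper's exact formula):
          x > 1       : x + alpha * x * sigmoid(x)   # sigmoid-weighted positive region
          0 <= x <= 1 : x                            # linear (identity) region
          x < 0       : beta * (exp(x) - 1)          # ELU-like negative region
        """
        sigmoid = 1.0 / (1.0 + np.exp(-x))
        return np.where(
            x > 1.0,
            x + alpha * x * sigmoid,
            np.where(x >= 0.0, x, beta * np.expm1(x)),
        )

    if __name__ == "__main__":
        xs = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
        print(rsigelu_sketch(xs))

With two independent slope parameters (alpha for the positive region, beta for the negative region) this resembles a double-parameter variant such as RSigELUD; setting beta equal to alpha would correspond to a single-parameter variant such as RSigELUS.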

Keywords: Deep learning, Nonlinear activation function, Convolutional neural networks, RSigELU

Article history: Received 10 April 2020, Revised 21 December 2020, Accepted 27 February 2021, Available online 3 March 2021, Version of Record 11 March 2021.

DOI: https://doi.org/10.1016/j.eswa.2021.114805