A pruning algorithm with relaxed conditions for high-order neural networks based on smoothing group L1/2 regularization and adaptive momentum

Authors:

Highlights:

Abstract

To enhance the sparsity of the network, improve its generalization ability, and accelerate its training, we propose a novel pruning approach for the sigma-pi-sigma neural network (SPSNN) under relaxed conditions, combining smoothing group L1/2 regularization with adaptive momentum. The main strength of this method is that it can prune not only the redundant nodes between groups in the network but also the redundant weights of the remaining nodes within each group, thereby sparsifying the network at both levels. Another strength is that the non-smooth absolute value function in the traditional L1/2 regularization is replaced by a smooth function. This reduces oscillations during learning and makes it possible to rigorously prove the convergence of the proposed algorithm. Finally, numerical simulation results demonstrate the effectiveness of the proposed algorithm.
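To make the two ingredients of the penalty concrete, the following is a minimal sketch of a smoothing group L1/2 regularized error function. The notation (error term \(\tilde{E}\), weight groups \(\mathbf{w}_g\), penalty coefficient \(\lambda\), smoothing parameter \(a\)) is ours for illustration, and the smoothing polynomial is the form commonly used in the smoothing L1/2 literature rather than a quotation from the paper:

\[
E(\mathbf{w}) \;=\; \tilde{E}(\mathbf{w}) \;+\; \lambda \sum_{g=1}^{G} \Big( \sum_{i \in g} f(w_i) \Big)^{1/2},
\qquad
f(x) \;=\;
\begin{cases}
|x|, & |x| \ge a,\\[4pt]
-\dfrac{x^4}{8a^3} + \dfrac{3x^2}{4a} + \dfrac{3a}{8}, & |x| < a.
\end{cases}
\]

Here \(f\) agrees with \(|x|\) outside \((-a, a)\) and is twice continuously differentiable at the junctions (\(f(\pm a) = a\), \(f'(\pm a) = \pm 1\), \(f''(\pm a) = 0\)), so the penalty has a well-defined gradient everywhere; moreover, since \(f(x) \ge 3a/8 > 0\), the square root is never evaluated at its non-differentiable point. Driving an entire group's term toward zero prunes a node, while shrinking an individual \(w_i\) within a surviving group prunes a single weight, which is the two-level sparsity described in the abstract.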

Keywords: Sigma-pi-sigma neural network, Smoothing group L1/2 regularization, Adaptive momentum, Convergence

Article history: Received 18 October 2021, Revised 30 August 2022, Accepted 31 August 2022, Available online 21 September 2022, Version of Record 28 September 2022.

DOI: https://doi.org/10.1016/j.knosys.2022.109858