Truncation: A New Approach to Neural Network Reduction

Authors: Alexey A. Nevzorov, Sergey V. Perchenko, Dmitry A. Stankevich

Abstract

In this manuscript, we propose a method for optimizing the number of neurons in the hidden layers of a multilayer neural network. The method is similar to dropout: some neurons are excluded during training, but the probability of a neuron being excluded depends on its position in the layer. As a result of training, neurons at the beginning of a layer develop stronger connections to the next layer and make the main contribution to the output, while neurons at the end of the layer have weak connections and can be excluded when the network is deployed. Using fully connected and convolutional neural networks with one and several hidden layers as examples, we show that the proposed method yields the dependence of network accuracy on the number of neurons in a layer after a single training cycle, and also provides some regularization. For example, applying the method to a four-layer convolutional network reduced the layer sizes from (50, 50, 100, 100) to (36, 36, 74, 56) without loss of accuracy.
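The core idea of position-dependent exclusion can be sketched as a masking function. The sketch below uses a linear keep-probability schedule (1.0 for the first neuron, decreasing toward 0 at the end of the layer); the exact schedule is an assumption for illustration, not necessarily the paper's choice.

```python
import numpy as np

def truncation_mask(n_neurons, rng=None):
    """Sample a binary keep-mask for one hidden layer.

    Unlike standard dropout, where every neuron has the same drop
    probability, here the drop probability grows with the neuron's
    index: neurons at the start of the layer are almost always kept,
    while neurons at the end are frequently excluded. The linear
    schedule below is an illustrative assumption.
    """
    rng = np.random.default_rng() if rng is None else rng
    positions = np.arange(n_neurons)
    keep_prob = 1.0 - positions / n_neurons  # 1.0 at start, ~0 at end
    return (rng.random(n_neurons) < keep_prob).astype(np.float32)

# During training, the layer's activations are multiplied by a fresh
# mask on each step; after training, the layer can simply be truncated
# at the point where the remaining neurons no longer affect accuracy.
mask = truncation_mask(8, rng=np.random.default_rng(0))
```

Because the first neuron's keep probability is exactly 1.0, it is never dropped, which is what pushes the important features toward the beginning of the layer.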

Keywords: Neuron reduction, Regularization, Classification, Autoencoder


Paper link: https://doi.org/10.1007/s11063-021-10638-z