Deep belief networks with self-adaptive sparsity

Authors: Chen Qiao, Lan Yang, Yan Shi, Hanfeng Fang, Yanmei Kang

Abstract

Sparsity is crucial for deep neural networks: it improves their learning ability, especially on high-dimensional data with small sample sizes. The regularization terms commonly used to enforce sparsity in deep neural networks are based on the L1-norm or L2-norm; however, these are not the most faithful surrogates for the L0-norm. In this paper, based on the fact that minimizing a log-sum function is an effective approximation to minimizing the L0-norm, a sparse penalty on the connection weights using the log-sum function is introduced. By embedding the corresponding iterative re-weighted-L1 minimization algorithm into k-step contrastive divergence, the connections of deep belief networks can be updated in a sparse, self-adaptive manner. Experiments on two kinds of biomedical datasets, both typical small-sample-size datasets with a large number of variables, namely brain functional magnetic resonance imaging data and single nucleotide polymorphism data, show that the proposed deep belief networks with self-adaptive sparsity learn layer-wise sparse features effectively, and achieve better identification accuracy and sparsity than several typical learning machines.
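The paper's full derivation is not reproduced on this page. As a rough illustration of the idea summarized above, the sketch below (assuming a bias-free binary RBM layer; the hyperparameter names lr, lam, and eps are illustrative, not the authors') shows how the re-weighted-L1 surrogate of the log-sum penalty, whose gradient with respect to a weight w is sign(w)/(|w| + ε), could be folded into a k-step contrastive divergence weight update:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd_k_logsum_update(W, v0, k=1, lr=0.01, lam=1e-3, eps=1e-3, rng=np.random):
    """One CD-k step on an RBM weight matrix W with a log-sum sparsity penalty.

    The log-sum penalty sum(log(1 + |w|/eps)) is handled via the
    re-weighted-L1 idea: each weight is shrunk in proportion to
    1/(|w| + eps), so small weights are driven toward zero more
    aggressively than large ones (self-adaptive sparsity).
    Biases are omitted for brevity; hyperparameters are illustrative.
    """
    # Positive phase: hidden activations driven by the data.
    h0_prob = sigmoid(v0 @ W)
    h = (rng.random(h0_prob.shape) < h0_prob).astype(float)
    # k steps of Gibbs sampling for the negative phase.
    for _ in range(k):
        v_prob = sigmoid(h @ W.T)
        v = (rng.random(v_prob.shape) < v_prob).astype(float)
        h_prob = sigmoid(v @ W)
        h = (rng.random(h_prob.shape) < h_prob).astype(float)
    # Standard CD-k gradient estimate, averaged over the batch.
    grad = (v0.T @ h0_prob - v.T @ h_prob) / v0.shape[0]
    # Re-weighted-L1 shrinkage term from the log-sum surrogate.
    grad -= lam * np.sign(W) / (np.abs(W) + eps)
    return W + lr * grad

# Hypothetical usage on random binary data:
rng = np.random.default_rng(0)
W = 0.01 * rng.standard_normal((784, 256))
batch = (rng.random((32, 784)) < 0.5).astype(float)
W = cd_k_logsum_update(W, batch, k=1, rng=rng)
```

Note the role of eps: as eps shrinks, the log-sum penalty approaches the L0-norm more closely, but the shrinkage on near-zero weights becomes harsher, which is the trade-off behind the self-adaptive behavior described in the abstract.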

Keywords: Deep belief networks, Iterative re-weighted-L1 minimization algorithm, Self-adaptive sparsity, Contrastive divergence algorithm, Biomedical data

Paper link: https://doi.org/10.1007/s10489-021-02361-y