On the learning dynamics of two-layer quadratic neural networks for understanding deep learning

Authors: Zhenghao Tan, Songcan Chen

Abstract

Deep learning is a powerful paradigm in many real-world applications; however, its mechanism remains largely a mystery. To gain insight into nonlinear hierarchical deep networks, we theoretically characterize the coupled nonlinear learning dynamics of two-layer neural networks with quadratic activations, extending existing results for the linear case. The quadratic activation, although rarely used in practice, shares convexity with the widely used ReLU activation and thus produces similar dynamics. We focus on a canonical regression problem under the standard normal distribution: a coupled dynamical system models gradient descent in the continuous-time limit, and the high-order moment tensors of the normal distribution then simplify the resulting ordinary differential equations. The simplified system yields unexpected fixed points, and the existence of these non-globally-optimal stable points implies the existence of saddle points in the loss surface of quadratic networks. Our analysis also shows that certain quantities are conserved during training; these conserved quantities can cause learning to fail if the network is initialized improperly. Finally, we compare the numerical learning curves with the theoretical ones, revealing two alternately appearing stages of the learning process.
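The continuous-time picture above can be simulated numerically: a minimal sketch, in which a two-layer quadratic network f(x) = Σ_j a_j (w_j·x)² is trained by small-step gradient descent on standard-normal inputs, so that the iterates approximate the gradient flow the paper analyzes. The teacher-student setup, the dimensions, and the learning rate below are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 8, 3                 # input dimension and hidden width (illustrative)
n, eta, steps = 4096, 1e-3, 2000

# Teacher: y = sum_j (w*_j . x)^2, a quadratic-network regression target
W_star = rng.normal(size=(k, d))

# Student: two-layer network with quadratic activation
W = 0.1 * rng.normal(size=(k, d))   # first-layer weights (small init)
a = np.ones(k)                      # second-layer weights

X = rng.normal(size=(n, d))         # inputs drawn from the standard normal
y = ((X @ W_star.T) ** 2).sum(axis=1)

losses = []
for _ in range(steps):
    pre = X @ W.T                         # (n, k) pre-activations w_j . x
    f = (a * pre ** 2).sum(axis=1)        # network output
    r = f - y                             # residual
    losses.append(0.5 * (r ** 2).mean())
    # Small eta makes discrete gradient descent mimic the gradient flow
    grad_W = 2.0 * (r[:, None] * a * pre).T @ X / n
    grad_a = (r[:, None] * pre ** 2).mean(axis=0)
    W -= eta * grad_W
    a -= eta * grad_a
```

Plotting `losses` against the step index exhibits the plateau-and-drop shape typical of such dynamics: the loss stalls near a saddle before dropping, consistent with the alternating stages described in the abstract.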

Keywords: learning dynamics, quadratic network, ordinary differential equations

Paper URL: https://doi.org/10.1007/s11704-020-0298-0