A Dynamically Stabilized Recurrent Neural Network

Authors: Samer Saab Jr., Yiwei Fu, Asok Ray, Michael Hauser

Abstract

This work proposes a novel recurrent neural network architecture, called the Dynamically Stabilized Recurrent Neural Network (DSRNN). The DSRNN includes learnable skip-connections across a specified number of time-steps, which allow a state-space representation of the network’s hidden-state trajectory, and a regularization term is introduced into the loss function in the setting of Lyapunov stability theory. The regularizer enables placement of the eigenvalues of the (linearized) transfer function matrix at desired locations in the complex plane, thereby acting as an internal controller for the hidden-state trajectories. In this way, the DSRNN adjusts the weights of the temporal skip-connections to achieve recurrent hidden-state stability, which mitigates the problems of vanishing and exploding gradients. The efficacy of the DSRNN is demonstrated on a forecasting task using data recorded from a double-pendulum experiment. The results show that the DSRNN outperforms both the Long Short-Term Memory (LSTM) and vanilla recurrent neural networks, reducing the relative mean-squared error of the LSTM by up to \(\sim\)99.64%. The DSRNN also achieves results comparable to the LSTM on a classification task involving two Lorenz oscillator systems.
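To make the idea concrete, below is a minimal, hypothetical PyTorch-style sketch of the two ingredients the abstract describes: a recurrent cell with a learnable skip-connection across a fixed number of time-steps, and an eigenvalue-magnitude penalty on the linearized recurrence added to the loss. All names (`SkipRNNCell`, `stability_regularizer`, the choice of linearization `W_rec + W_skip`, and the penalty margin) are assumptions for illustration only; this is not the authors' DSRNN implementation or their exact regularizer.

```python
import torch
import torch.nn as nn

class SkipRNNCell(nn.Module):
    """Illustrative RNN cell with a learnable skip-connection across `skip` time-steps.

    Sketch of the idea in the abstract, not the authors' implementation.
    """

    def __init__(self, input_size, hidden_size, skip=2):
        super().__init__()
        self.skip = skip
        self.W_in = nn.Linear(input_size, hidden_size)
        self.W_rec = nn.Linear(hidden_size, hidden_size, bias=False)   # h_{t-1} -> h_t
        self.W_skip = nn.Linear(hidden_size, hidden_size, bias=False)  # h_{t-skip} -> h_t (learnable skip)

    def forward(self, x_t, h_prev, h_skip):
        # Hidden-state update combining the previous state and the delayed (skip) state.
        return torch.tanh(self.W_in(x_t) + self.W_rec(h_prev) + self.W_skip(h_skip))

def stability_regularizer(cell, margin=1.0):
    """Penalize eigenvalues of a linearized recurrence matrix whose modulus exceeds `margin`.

    A crude stand-in for the Lyapunov-style eigenvalue-placement regularizer
    mentioned in the abstract; the linearization used here is an assumption.
    """
    A = cell.W_rec.weight + cell.W_skip.weight
    eigvals = torch.linalg.eigvals(A)
    return torch.relu(eigvals.abs() - margin).sum()

# Usage sketch: roll the cell over a short sequence and add the penalty to the loss.
cell = SkipRNNCell(input_size=3, hidden_size=16, skip=2)
x = torch.randn(20, 3)                    # (time, features)
hs = [torch.zeros(16), torch.zeros(16)]   # buffer holding enough past states for the skip
for t in range(x.shape[0]):
    h = cell(x[t], hs[-1], hs[-cell.skip])
    hs.append(h)
loss = h.pow(2).mean() + 0.1 * stability_regularizer(cell)
loss.backward()
```

The design intent, as the abstract describes it, is that penalizing eigenvalues outside a desired region keeps the hidden-state dynamics stable, which in turn mitigates vanishing and exploding gradients during training.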

Keywords: Recurrent neural networks, Long short-term memory, Lyapunov stability

Paper link: https://doi.org/10.1007/s11063-021-10676-7