Adaptative time constants improve the prediction capability of recurrent neural networks

Authors: Jean-Philippe Draye, Davor Pavisic, Guy Cheron, Gaëtan Libert

Abstract

Classical statistical techniques for prediction reach their limits in applications with nonlinearities in the data set; neural models, however, can overcome these limitations. In this paper, we present a recurrent neural model in which an adaptative time constant is associated with each neuron-like unit, together with a learning algorithm to train these dynamic recurrent networks. We test the network by training it to predict the Mackey-Glass chaotic signal. To evaluate the quality of the prediction, we compute the power spectra of the two signals and the associated fractional error. Results show that introducing adaptative time constants associated with each neuron of a recurrent network improves both the quality of the prediction and the dynamical features of the neural model. Such dynamic recurrent neural networks outperform time-delay neural networks.
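The two ingredients described above, the Mackey-Glass benchmark signal and a neuron-like unit governed by its own time constant, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Mackey-Glass parameters (beta = 0.2, gamma = 0.1, n = 10, tau = 17) are the values conventionally used for this benchmark, the simple Euler integration and the tanh activation are assumptions, and in the paper's model the time constant T of each unit is a trainable parameter adapted by the learning algorithm rather than fixed.

```python
import math

def mackey_glass(n_steps, tau=17.0, beta=0.2, gamma=0.1, n=10, dt=0.1, x0=1.2):
    """Generate the Mackey-Glass chaotic series by Euler integration of
    dx/dt = beta * x(t - tau) / (1 + x(t - tau)^n) - gamma * x(t).
    A constant initial history x(t) = x0 for t <= 0 is assumed."""
    delay = int(round(tau / dt))
    history = [x0] * (delay + 1)
    for _ in range(n_steps):
        x_t = history[-1]
        x_tau = history[-(delay + 1)]
        history.append(x_t + dt * (beta * x_tau / (1.0 + x_tau ** n) - gamma * x_t))
    return history[delay + 1:]

def leaky_unit_step(y, net_input, T, dt=0.1):
    """One Euler step of a leaky-integrator unit with time constant T:
    T * dy/dt = -y + f(net_input), with f = tanh as an illustrative choice.
    In an adaptive-time-constant network, T itself would be learned per unit."""
    return y + (dt / T) * (-y + math.tanh(net_input))
```

A larger T makes the unit integrate its input more slowly, so letting the learning algorithm tune T per unit lets the network match the multiple time scales present in a chaotic signal.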

Keywords: Neural Network, Artificial Intelligence, Complex System, Power Spectrum, Nonlinear Dynamics


DOI: https://doi.org/10.1007/BF02311573