Constrained RTRL To Reduce Learning Rate and Forgetting Phenomenon

Authors: Fabrice Druaux, Emmanuel Rogue, Alain Faure

Abstract

A fully connected continuous-time recurrent neural network, trained by means of Real-Time Recurrent Learning (RTRL), is investigated. A theoretical analysis of the network's output vector during the training stage is performed. We point out the necessity of applying an additional constraint to the synaptic weight matrix in order to reduce the learning time while decreasing forgetting. This constraint consists of updating the weights of the output cells within RTRL using the output error gradient and a matrix of learning rates computed from an average of the previously memorized vectors. In this first approach to the problem, only fixed-point attractors have been investigated. Simple computational simulations validate the method.
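To make the described scheme concrete, the sketch below shows a discrete-time Euler approximation of a continuous-time RNN trained with RTRL toward a fixed-point target, where the output-cell weights are updated through a per-weight learning-rate matrix derived from the average of previously memorized vectors. The network size, time step, and the exact formula building the learning-rate matrix from that average are assumptions for illustration; the abstract does not specify them, so this is not the authors' exact constraint.

```python
import numpy as np

# Illustrative sketch only: Euler-discretized continuous-time RNN trained with
# RTRL, plus a per-weight learning-rate matrix built from the average of
# previously memorized target vectors. The precise constraint of Druaux,
# Rogue and Faure is not given in the abstract; `eta` below is hypothetical.

rng = np.random.default_rng(0)

N = 4            # number of fully connected units
N_OUT = 2        # the first N_OUT units are read as outputs
DT = 0.1         # Euler time step
TAU = 1.0        # common time constant
ETA0 = 0.5       # base learning rate

W = 0.1 * rng.standard_normal((N, N))   # synaptic weight matrix
x = np.zeros(N)                         # unit activations
P = np.zeros((N, N, N))                 # RTRL sensitivities P[k, i, j] = dx_k / dW_ij

def step(x, W):
    """One Euler step of the continuous-time network dynamics."""
    net = W @ x
    return x + (DT / TAU) * (-x + np.tanh(net)), net

def rtrl_update(x, P, W, target, eta):
    """One RTRL step toward a fixed-point target on the output units."""
    x_new, net = step(x, W)
    dphi = 1.0 - np.tanh(net) ** 2      # derivative of tanh at the net input

    # Sensitivity dynamics:
    # dP_k,ij/dt = -P_k,ij + phi'(net_k) * (delta_ki * x_j + sum_l W_kl * P_l,ij)
    drive = np.einsum('kl,lij->kij', W, P)
    drive += np.einsum('ki,j->kij', np.eye(N), x)
    P_new = P + (DT / TAU) * (-P + dphi[:, None, None] * drive)

    # Output error only on the output units (fixed-point attractor target)
    err = np.zeros(N)
    err[:N_OUT] = x_new[:N_OUT] - target

    # Constrained update: element-wise learning rates eta[i, j]
    grad = np.einsum('k,kij->ij', err, P_new)
    return x_new, P_new, W - eta * grad

# Hypothetical constraint: damp the learning rate of the output-cell rows in
# proportion to the average of previously memorized vectors, so that new
# learning disturbs old memories less.
memorized = [np.array([0.3, -0.2]), np.array([0.5, 0.1])]   # toy previous targets
avg = np.mean(memorized, axis=0)
eta = ETA0 * np.ones((N, N))
eta[:N_OUT, :] /= (1.0 + np.abs(avg)[:, None])

target = np.array([0.4, -0.3])
for _ in range(2000):
    x, P, W = rtrl_update(x, P, W, target, eta)

print("final outputs:", x[:N_OUT], "target:", target)
```

Keeping the learning rate as a matrix rather than a scalar is what allows the output-cell weights to be slowed down selectively, which is the spirit of the constraint described in the abstract.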

Keywords: constrained learning, forgetting, real-time recurrent learning

Paper URL: https://doi.org/10.1023/A:1009677128478