Global Stability of a General Class of Discrete-Time Recurrent Neural Networks

Authors: Zhigang Zeng, De-Shuang Huang, Zengfu Wang

Abstract

A general class of discrete-time recurrent neural networks (DTRNNs) is formulated and studied in this paper. Several sufficient conditions are obtained to ensure the global stability of DTRNNs with delays, based on the induction principle rather than the well-known Liapunov methods. The obtained results assume neither symmetry of the connection matrix, nor boundedness, monotonicity, or differentiability of the activation functions. In addition, discrete-time analogues of a general class of continuous-time recurrent neural networks (CTRNNs) are derived and studied. The convergence characteristics of the CTRNNs are preserved by the discrete-time analogues without any restriction imposed on the uniform discretization step size. Finally, simulation results demonstrate the validity and feasibility of the proposed approach.
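
To make the setting concrete, the sketch below iterates one possible discrete-time analogue of a delayed recurrent network, namely a simple Euler-type scheme x(k+1) = x(k) + h(-Dx(k) + Af(x(k)) + Bf(x(k-τ)) + u). This is only an illustration of a DTRNN with delays, under assumed notation (D, A, B, u, τ, h, f); it is not the paper's specific analogue, and it does not reproduce the paper's stability conditions or its step-size-independent convergence result.

```python
import numpy as np

# Minimal illustrative sketch of iterating a delayed discrete-time recurrent
# network of the assumed form
#   x(k+1) = x(k) + h * ( -D x(k) + A f(x(k)) + B f(x(k - tau)) + u ).
# All symbols here (D, A, B, u, tau, h, f) are illustrative assumptions,
# not the paper's notation or its particular discretization scheme.

def simulate_dtrnn(D, A, B, u, tau, h=0.1, steps=200, f=np.tanh, x0=None):
    """Iterate the delayed discrete-time network and return the trajectory."""
    n = len(u)
    rng = np.random.default_rng(0)
    x0 = rng.standard_normal(n) if x0 is None else np.asarray(x0, float)
    # History buffer holding x(k - tau), ..., x(k); constant initial history.
    history = [x0.copy() for _ in range(tau + 1)]
    traj = [x0.copy()]
    for _ in range(steps):
        x_now, x_delayed = history[-1], history[0]
        x_next = x_now + h * (-D @ x_now + A @ f(x_now) + B @ f(x_delayed) + u)
        history.append(x_next)
        history.pop(0)
        traj.append(x_next.copy())
    return np.array(traj)

if __name__ == "__main__":
    n = 3
    D = np.eye(n)                                              # self-decay rates
    A = 0.2 * np.random.default_rng(1).standard_normal((n, n)) # instantaneous weights
    B = 0.1 * np.random.default_rng(2).standard_normal((n, n)) # delayed weights
    u = np.zeros(n)                                            # constant input
    traj = simulate_dtrnn(D, A, B, u, tau=5, h=0.5)
    print("final state:", traj[-1])  # with small weights the state settles down
```

With weakly coupled weights, as in the example parameters above, the trajectory converges to an equilibrium; the paper's contribution is to give checkable sufficient conditions under which such global convergence is guaranteed.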

Keywords: delay, discrete-time, global stability, induction principle, recurrent neural networks

Paper link: https://doi.org/10.1007/s11063-004-8194-4