Global Output Convergence of a Class of Recurrent Delayed Neural Networks with Discontinuous Neuron Activations

Authors: Zhenyuan Guo, Lihong Huang

Abstract

This paper studies the global output convergence of a class of recurrent delayed neural networks with time-varying inputs. We consider non-decreasing activations that may have jump discontinuities, in order to model the ideal situation where the gain of the neuron amplifiers is very high and tends to infinity. In particular, we drop the assumptions of Lipschitz continuity and boundedness on the activation functions, which are required in most existing works. Because of the possible discontinuities of the activation functions, we introduce a suitable notion of limit to study the convergence of the output of the recurrent delayed neural networks. Under suitable assumptions on the interconnection matrices and the time-varying inputs, we establish a sufficient condition for the global output convergence of this class of neural networks. The convergence results are useful in solving some optimization problems and in the design of recurrent delayed neural networks with discontinuous neuron activations.
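
The paper itself contains no code; as a rough illustration of the class of systems it studies, the following is a minimal Euler-method simulation of a delayed recurrent network with a discontinuous, non-decreasing, unbounded activation. The system form x'(t) = -D x(t) + A g(x(t)) + B g(x(t - tau)) + u(t), the sign-plus-linear activation, and all parameter values below are assumptions chosen for illustration, not taken from the paper; the paper's actual model and sufficient condition may differ.

```python
import numpy as np

# Illustrative simulation of a delayed recurrent network (assumed form):
#   x'(t) = -D x(t) + A g(x(t)) + B g(x(t - tau)) + u(t)
# with a discontinuous, non-decreasing activation g and a time-varying
# input u(t) that converges to a constant vector.

def g(x):
    # Jump discontinuity at the origin plus a linear part, so g is
    # neither Lipschitz continuous nor bounded (as allowed in the paper).
    return np.sign(x) + x

def simulate(D, A, B, u, tau=1.0, T=50.0, dt=1e-3, x0=None):
    n = D.shape[0]
    steps = int(T / dt)
    delay = int(tau / dt)
    x = np.zeros((steps + 1, n))
    x[0] = x0 if x0 is not None else np.random.randn(n)
    for k in range(steps):
        xd = x[max(k - delay, 0)]  # delayed state (constant initial history)
        dx = -D @ x[k] + A @ g(x[k]) + B @ g(xd) + u(k * dt)
        x[k + 1] = x[k] + dt * dx  # forward Euler step
    return x

if __name__ == "__main__":
    n = 2
    D = np.eye(n)
    # Small interconnection weights so the -D term dominates; dominance
    # conditions of this kind are typical sufficient conditions in this
    # literature, though the paper's exact condition may differ.
    A = 0.1 * np.array([[0.5, -0.3], [0.2, 0.4]])
    B = 0.1 * np.array([[0.3, 0.1], [-0.2, 0.2]])
    u = lambda t: np.array([1.0, -0.5]) * (1.0 + np.exp(-t))
    x = simulate(D, A, B, u)
    print("output g(x) near final time:", g(x[-1]))
```

Running the script, the output trajectory g(x(t)) settles toward a fixed vector despite the jump in g and the delayed feedback, which is the kind of global output convergence behavior the paper characterizes.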

Keywords: Recurrent delayed neural networks, Global output convergence, Discontinuous neuron activations, Time-varying inputs

DOI: https://doi.org/10.1007/s11063-009-9119-z