Improved GNN Models for Constant Matrix Inversion

Authors: Predrag S. Stanimirović, Marko D. Petković

Abstract

It has been shown that multiplying the left-hand side of the classical Zhang neural network (ZNN) design rule by an appropriate positive definite matrix generates a new neural design with an improved convergence rate. Our intention is to apply a similar principle to the standard gradient neural network (GNN) model. To that end, we observe that some previously proposed models can be viewed as the multiplication of the right-hand side of the GNN model by a symmetric positive semidefinite matrix. As a final result, we propose a general pattern for defining various improvements of the standard GNN design for online real-time matrix inversion in the time-invariant case. The leading idea in generating improved models arises from a combination of two GNN patterns. The improved GNN (IGNN) design exhibits global exponential convergence with a convergence rate improved with respect to that of the original GNN pattern. The acceleration of the convergence rate is determined by the smallest eigenvalue of appropriate positive semidefinite matrices. IGNN models are not only generalizations of the original GNN models but also encompass previously defined improvements of the standard GNN design.
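
The abstract describes scaling the right-hand side of the GNN dynamics by a symmetric positive semidefinite matrix. As a rough, hypothetical illustration of that idea (not taken from the paper), the following Python sketch integrates the standard GNN dynamics dX/dt = -γ Aᵀ F(A X(t) - I) for constant matrix inversion and an IGNN-style variant; the scaling matrix W = I + AᵀA, the linear activation F(E) = E, and forward-Euler integration are all illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation): standard GNN vs. an
# IGNN-style variant whose right-hand side is multiplied by a symmetric
# positive (semi)definite matrix W.  W = I + A^T A is one assumed choice.
import numpy as np

def simulate(A, W=None, gamma=1.0, dt=1e-3, steps=3000):
    """Integrate dX/dt = -gamma * W * A^T * (A X - I) by forward Euler."""
    n = A.shape[0]
    I = np.eye(n)
    W = I if W is None else W          # W = I recovers the standard GNN
    X = np.zeros((n, n))               # zero initial state
    for _ in range(steps):
        E = A @ X - I                  # residual error matrix A X(t) - I
        X = X - dt * gamma * (W @ A.T @ E)
    return X, np.linalg.norm(A @ X - I, "fro")

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])             # small, well-conditioned test matrix

X_gnn,  res_gnn  = simulate(A)                          # standard GNN
X_ignn, res_ignn = simulate(A, W=np.eye(2) + A.T @ A)   # IGNN-style scaling

print("GNN  residual ||A X - I||_F :", res_gnn)
print("IGNN residual ||A X - I||_F :", res_ignn)
print("max |X_ignn - inv(A)|       :", np.abs(X_ignn - np.linalg.inv(A)).max())
```

With W = I the sketch reduces to the standard GNN; in the scaled variant the residual decays faster because the smallest eigenvalue of W AᵀA exceeds that of AᵀA, which is consistent with the abstract's statement that the acceleration is governed by the smallest eigenvalue of the scaling matrix.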

Keywords: Gradient neural network, Zhang neural network, Matrix inverse, Dynamic equation, Activation function

Review process:

Paper URL: https://doi.org/10.1007/s11063-019-10025-9