A Sigma-Pi-Sigma Neural Network (SPSNN)

Author: Chien-Kuo Li

Abstract

This letter presents a sigma-pi-sigma neural network (SPSNN) structure. The SPSNN can learn to implement the static mappings that multilayer neural networks and radial basis function networks normally perform. The output of the SPSNN has the sum-of-product-of-sums form \(\sum_{n=1}^{K} \prod_{i=1}^{n} \sum_{j=1}^{N_v} f_{nij}(x_j)\), where the \(x_j\)'s are inputs, \(N_v\) is the number of inputs, \(f_{nij}(\cdot)\) is a function generated through network training, and \(K\) is the number of pi-sigma networks (PSNs), the basic building block of the SPSNN. A linear memory array can be used to implement \(f_{nij}(\cdot)\). The function \(f_{nij}(x_j)\) can be expressed as \(\sum_{k=1}^{N_q + N_e - 1} w_{nijk} B_{ijk}(x_j)\), where \(B_{ijk}(\cdot)\) is a single-variable basis function, the \(w_{nijk}\)'s are weight values stored in memory, \(N_q\) is the number of quantized elements for \(x_j\), and \(N_e\) is the number of basis functions in the neighborhood used for storing information about \(x_j\). If all \(B_{ijk}(\cdot)\)'s are Gaussian functions, the new neural network degenerates to a Gaussian function network. This letter focuses on the use of overlapped rectangular pulses as the basis functions. With such basis functions, \(w_{nijk} B_{ijk}(x_j)\) equals either zero or \(w_{nijk}\), and the computation of \(f_{nij}(x_j)\) reduces to a simple addition of the retrieved \(w_{nijk}\)'s. The new neural network structure demonstrates excellent learning convergence characteristics and requires little memory space. It has merits over multilayer neural networks, radial basis function networks, and CMAC.
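
The abstract fully specifies the forward computation, so a minimal sketch can be written down directly: quantize each input, retrieve the \(N_e\) weights whose unit-height rectangular pulses cover it, sum them to get \(f_{nij}(x_j)\), then combine via the sum-of-product-of-sums. The function names (`active_cells`, `spsnn_forward`), the assumption that inputs lie in [0, 1], and the weight-memory layout are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def active_cells(x, n_q, n_e):
    """Indices of the overlapped rectangular pulses covering input x.

    x is assumed to lie in [0, 1] (an assumption; the paper's input
    range is not given in the abstract). Quantizing x into n_q elements
    and activating the n_e neighboring cells means each active pulse has
    unit height, so w * B(x) is either 0 or w, as the abstract states.
    """
    q = min(int(x * n_q), n_q - 1)           # quantized element index
    return np.arange(q, q + n_e)             # n_e consecutive cell indices

def spsnn_forward(x, W, n_q, n_e):
    """Forward pass: sum_{n=1..K} prod_{i=1..n} sum_{j} f_nij(x_j).

    W has shape (K, K, N_v, n_q + n_e - 1): weight memory indexed by the
    n-th PSN, i-th summing unit, j-th input, and k-th basis function.
    (This dense layout is an assumption made for illustration.)
    """
    K, _, n_v, _ = W.shape
    y = 0.0
    for n in range(K):                        # K pi-sigma networks
        prod = 1.0
        for i in range(n + 1):                # n-th PSN multiplies n factors
            s = 0.0
            for j in range(n_v):              # sum over the N_v inputs
                cells = active_cells(x[j], n_q, n_e)
                s += W[n, i, j, cells].sum()  # f_nij(x_j): add retrieved weights
            prod *= s
        y += prod
    return y
```

A call such as `spsnn_forward(np.array([0.3, 0.7]), rng.normal(size=(3, 3, 2, 12)), n_q=8, n_e=5)` (with `rng = np.random.default_rng(0)`) evaluates a two-input network with K = 3 PSNs; note that the last weight axis has size \(N_q + N_e - 1 = 12\), matching the summation limit in the abstract.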

Keywords: function approximation, memory-based neural network, ridge polynomial network, self-generated basis function, sigma-pi-sigma neural network

DOI: https://doi.org/10.1023/A:1022967523886