An efficient initialization mechanism of neurons for Winner Takes All Neural Network implemented in the CMOS technology

Authors:

Highlights:

Abstract

The paper presents a new initialization mechanism, based on the Convex Combination Method (CCM), for Kohonen self-organizing neural networks (NNs) realized in CMOS technology. The choice of initial neuron weight values has a strong impact on the quality of the overall learning process. Unfortunately, for real input data, e.g. biomedical data, proper initialization is difficult to perform, as the exact data distribution is usually unknown. Poor initialization can leave even 70%–80% of the neurons inactive, which increases the quantization error and thus limits the classification abilities of the NN. The proposed initialization algorithm has two important advantages. First, it does not require knowledge of the data distribution in the input data space. Second, no initial polarization of the neuron weights is necessary before the learning process starts. This feature is very convenient in transistor-level realizations, as the programming lines, which in other approaches occupy a large chip area, are not required. We propose a modification of the original CCM algorithm: a new parameter, represented in the proposed analog CMOS realization by an external current, allows the behavior of the mechanism to be fitted to NNs containing different numbers of neurons. The investigations show that the modified CCM operates properly for NNs containing as many as 250 neurons. A single CCM block realized in the CMOS 180 nm technology occupies an area of 300 μm², dissipates an average power of 20 μW, and operates at data rates of up to 20 MHz.
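To make the underlying idea concrete, the following is a minimal NumPy sketch of the *generic* convex combination method for winner-takes-all training, not the authors' modified, circuit-level algorithm: all weights start at the same uniform point (so no data-dependent initialization or weight-programming step is needed), and the effective inputs are blended toward the true data as a parameter α ramps from near 0 to 1. The function names, the linear α schedule, and the small symmetry-breaking noise are assumptions for illustration.

```python
import numpy as np

def ccm_blend(x, alpha):
    """Blend input x with the uniform vector (1/sqrt(n), ..., 1/sqrt(n)).

    Classic CCM: for small alpha every blended input sits near the point
    where all weights start, so every neuron gets a chance to win; as
    alpha -> 1 the inputs spread back out to the true data distribution.
    """
    n = x.shape[-1]
    uniform = np.full(n, 1.0 / np.sqrt(n))
    return alpha * x + (1.0 - alpha) * uniform

def train_wta_ccm(data, n_neurons, epochs=20, lr=0.1, seed=0):
    """Winner-takes-all training with CCM-style input blending (sketch)."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    # All weights start at the common uniform point -- no knowledge of the
    # input data distribution is required.
    weights = np.full((n_neurons, dim), 1.0 / np.sqrt(dim))
    weights += 1e-3 * rng.standard_normal(weights.shape)  # break symmetry
    for epoch in range(epochs):
        alpha = (epoch + 1) / epochs  # ramp alpha from ~0 up to 1
        for x in data:
            xb = ccm_blend(x, alpha)
            # winner-takes-all: only the closest neuron is updated
            winner = np.argmin(np.linalg.norm(weights - xb, axis=1))
            weights[winner] += lr * (xb - weights[winner])
    return weights
```

In the paper's modified CCM, an analogous tuning knob is realized as an external current, which lets the same mechanism scale to networks with different neuron counts.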

Keywords: Winner Takes All Neural Network, Initialization mechanism, Convex Combination Method, CMOS implementation, Analog circuits, Low energy consumption

Article history: Available online 26 May 2015; Version of Record 20 September 2015.

DOI: https://doi.org/10.1016/j.amc.2015.04.123