An incremental learning preprocessor for feed-forward neural network

Author: Piyabute Fuangkhon

Abstract

The Outpost Vector model synthesizes new vectors at the boundary between two classes of data in order to preserve the shape (contour) of the current system and thereby increase classification accuracy. This paper presents an incremental learning preprocessor for a Feed-forward Neural Network (FFNN) that uses the Outpost Vector model to improve classification accuracy on both new data and old data. The preprocessor generates outpost vectors from selected new samples, from selected prior samples, from both, or generates no outpost vectors at all. These outpost vectors are then included in the final training set, together with the selected new samples and selected prior samples, according to the specified parameters. The final training set is used to train the FFNN, and the whole process is repeated once enough new samples have been collected, so that newer knowledge can be learned. The experiments are conducted on a two-dimensional partition problem: the training and test samples are distributed within a bounded region of a two-dimensional donut ring, and the context of the problem is assumed to shift 45° counterclockwise. There are two classes of data, represented as 0 and 1, and every pair of consecutive partitions is assigned different classes for both new data and old data. The experimental results show that using outpost vectors generated from selected new samples, selected prior samples, or both improves the classification accuracy for all data. The run-time complexity analysis shows that the overhead of the outpost vector generation process is insignificant and is compensated by the improved classification accuracy.
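The sketch below illustrates the preprocessing loop described in the abstract: generate outpost vectors from the new and/or prior samples, merge them with the selected samples into a final training set, and retrain the FFNN. It is a minimal sketch, not the paper's method; the helper names (generate_outpost_vectors, incremental_train) are hypothetical, the outpost generation rule here is a simplified nearest-opposite-class heuristic rather than the exact rule of the Outpost Vector model, and scikit-learn's MLPClassifier stands in for the FFNN.

```python
# Minimal sketch of the incremental learning preprocessor loop (assumptions:
# helper names are hypothetical; the outpost rule below is a simplified
# nearest-opposite-class heuristic, not the paper's exact Outpost Vector rule).
import numpy as np
from sklearn.neural_network import MLPClassifier


def generate_outpost_vectors(X, y, ratio=0.5):
    """Synthesize boundary vectors: for each sample, place a new vector
    part-way toward its nearest sample of the opposite class."""
    outpost_X, outpost_y = [], []
    for x, label in zip(X, y):
        opposite = X[y != label]
        if len(opposite) == 0:
            continue
        nearest = opposite[np.argmin(np.linalg.norm(opposite - x, axis=1))]
        outpost_X.append(x + ratio * (nearest - x))
        outpost_y.append(label)  # outpost keeps the label of its source class
    return np.array(outpost_X), np.array(outpost_y)


def incremental_train(model, prior_X, prior_y, new_X, new_y,
                      use_new_outposts=True, use_prior_outposts=True):
    """Build the final training set from selected prior/new samples plus the
    outpost vectors generated from them, then retrain the FFNN."""
    parts_X, parts_y = [prior_X, new_X], [prior_y, new_y]
    if use_new_outposts:
        ox, oy = generate_outpost_vectors(new_X, new_y)
        parts_X.append(ox)
        parts_y.append(oy)
    if use_prior_outposts:
        ox, oy = generate_outpost_vectors(prior_X, prior_y)
        parts_X.append(ox)
        parts_y.append(oy)
    model.fit(np.vstack(parts_X), np.concatenate(parts_y))
    return model


# Example: two 2-D batches, the second simulating a shifted problem context.
rng = np.random.default_rng(0)
prior_X = rng.normal(0.0, 1.0, (100, 2))
prior_y = (prior_X[:, 0] > 0).astype(int)
new_X = rng.normal(0.5, 1.0, (100, 2))
new_y = (new_X[:, 1] > 0).astype(int)
ffnn = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000)
incremental_train(ffnn, prior_X, prior_y, new_X, new_y)
```

The two boolean flags mirror the paper's four configurations (outposts from new samples, from prior samples, from both, or none); in an actual incremental setting, the retrained set would become the prior data for the next round.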

Keywords: Contour preserving classification, Incremental learning, Learning in adaptive environment, Neural network, Outpost vector


Paper link: https://doi.org/10.1007/s10462-011-9304-0