State Preserving Extreme Learning Machine: A Monotonically Increasing Learning Approach

Authors: Md. Zahangir Alom, Paheding Sidike, Tarek M. Taha, Vijayan K. Asari

Abstract

Extreme Learning Machines (ELM) have been introduced as a new algorithm for training single-hidden-layer feedforward neural networks in place of classical gradient-based approaches. Based on the consistency property of data, which enforces similar samples to share similar properties, ELM is a biologically inspired learning algorithm that learns much faster, generalizes well, and performs well in classification tasks. However, the stochastic character of the hidden-layer outputs, caused by the random generation of the weight matrix in current ELMs, can produce unstable outputs in both the learning and testing phases. This is detrimental to overall performance when many repeated trials are conducted. To cope with this issue, we present a new ELM approach, named State Preserving Extreme Learning Machine (SPELM). SPELM retains the overall training and testing performance of the classical ELM while monotonically increasing its accuracy by preserving state variables. For evaluation, experiments are performed on several benchmark datasets covering face recognition, pedestrian detection, and network intrusion detection for cyber security. Several popular feature extraction techniques, namely Gabor, pyramid histogram of oriented gradients, and local binary pattern, are also incorporated with SPELM. Experimental results show that the SPELM algorithm yields the best performance on the tested data, outperforming both ELM and RELM.
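The abstract describes state preservation as keeping the hidden-layer state across repeated random trials so that accuracy never decreases. The sketch below is a minimal NumPy illustration of that idea, not the paper's exact procedure: the function names (`elm_fit`, `spelm_train`), the tanh activation, and the validation-based selection rule are assumptions introduced here for clarity.

```python
import numpy as np

def elm_fit(X, T, W, b):
    # Classical ELM step: fixed random hidden weights, least-squares output weights.
    H = np.tanh(X @ W + b)               # hidden-layer activations
    beta = np.linalg.pinv(H) @ T         # output weights via Moore-Penrose pseudoinverse
    return beta

def elm_accuracy(X, y, W, b, beta):
    # Classification accuracy for integer labels y.
    H = np.tanh(X @ W + b)
    pred = np.argmax(H @ beta, axis=1)
    return np.mean(pred == y)

def spelm_train(X_tr, T_tr, X_val, y_val, n_hidden=100, n_trials=20, seed=0):
    # State-preserving loop (assumed interpretation): re-randomize the hidden
    # weights each trial, but keep only the best state seen so far, so the
    # recorded accuracy is monotonically non-decreasing across trials.
    rng = np.random.default_rng(seed)
    best = {"acc": -1.0, "W": None, "b": None, "beta": None}
    for _ in range(n_trials):
        W = rng.uniform(-1, 1, size=(X_tr.shape[1], n_hidden))
        b = rng.uniform(-1, 1, size=(1, n_hidden))
        beta = elm_fit(X_tr, T_tr, W, b)
        acc = elm_accuracy(X_val, y_val, W, b, beta)
        if acc > best["acc"]:            # preserve the improved state only
            best = {"acc": acc, "W": W, "b": b, "beta": beta}
    return best
```

In this reading, an ordinary ELM would simply use the last random draw, whereas the state-preserving variant reverts to the stored state whenever a new random draw performs worse, which is what makes repeated trials stable.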

Keywords: Extreme learning machine (ELM), Face recognition, Pedestrian detection, Intrusion detection, Feature extraction, State preserving ELM

Paper link: https://doi.org/10.1007/s11063-016-9552-8