Online learning with kernel regularized least mean square algorithms

Authors:

Highlights:

Abstract

In this paper, we propose a novel kernel least mean square algorithm with regularized structural risk for online learning. To curb the continual growth in the number of kernel functions, a new dictionary selection method based on the cumulative coherence measure performs the sparsification, yielding a dictionary whose Gram matrix is diagonally dominant under certain conditions. For the kernel weight update, the linear least mean square algorithm is generalized to the reproducing kernel Hilbert space (RKHS) with minimized updating structural risk, resulting in a kernel regularized least mean square (KRLMS) algorithm. A simplified version of the KRLMS algorithm is also presented, which uses only partial updating information at each iteration and thereby reduces the computational complexity. Convergence of both algorithms is analyzed theoretically, and variable learning rates are adopted in the training process to guarantee weight convergence under a bounded measurement error. Several experiments demonstrate the effectiveness of the proposed algorithms for online learning compared to some existing kernel algorithms.
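The abstract does not give the paper's exact update rule or its definition of cumulative coherence, but the general scheme it describes can be sketched as follows: a kernel LMS learner that (a) admits a new sample into the dictionary only when its cumulative coherence with the current dictionary stays below a threshold, and (b) shrinks the kernel weights at each step to account for the regularized structural risk. All parameter names, the Gaussian kernel choice, and the threshold value below are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Illustrative kernel choice; the paper does not fix one in the abstract.
    return float(np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2)))

class KernelRegularizedLMS:
    """Generic sketch of a regularized kernel LMS with
    cumulative-coherence sparsification (not the paper's exact KRLMS)."""

    def __init__(self, eta=0.2, lam=0.01, delta=0.5, sigma=1.0):
        self.eta = eta        # learning rate (could be made iteration-varying)
        self.lam = lam        # regularization strength on the structural risk
        self.delta = delta    # cumulative-coherence admission threshold (assumed)
        self.sigma = sigma
        self.centers = []     # dictionary of admitted samples
        self.alpha = []       # kernel expansion weights

    def predict(self, x):
        return sum(a * gaussian_kernel(x, c, self.sigma)
                   for a, c in zip(self.alpha, self.centers))

    def update(self, x, y):
        e = y - self.predict(x)
        # Cumulative coherence: sum of kernel similarities to the dictionary.
        mu = sum(abs(gaussian_kernel(x, c, self.sigma)) for c in self.centers)
        # Regularization shrinks all existing weights each iteration.
        self.alpha = [(1.0 - self.eta * self.lam) * a for a in self.alpha]
        if not self.centers or mu <= self.delta:
            # Low coherence: admit x as a new dictionary center.
            self.centers.append(np.asarray(x, dtype=float))
            self.alpha.append(self.eta * e)
        return e

# Usage: one online pass over noiseless samples of sin(pi * x).
model = KernelRegularizedLMS()
for x in np.linspace(-1.0, 1.0, 50):
    model.update(np.array([x]), np.sin(np.pi * x))
```

Because admission is gated by the coherence threshold, the dictionary stays a strict subset of the stream, which is the sparsification effect the abstract describes.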

Keywords: Kernel method, Dictionary, Cumulative coherence, Diagonally dominant, Weight convergence

Article history: Received 30 March 2013, Revised 31 January 2014, Accepted 1 February 2014, Available online 8 February 2014.

DOI: https://doi.org/10.1016/j.knosys.2014.02.005