Kernel-based regression via a novel robust loss function and iteratively reweighted least squares

Authors: Hongwei Dong, Liming Yang

Abstract

Least squares kernel-based methods have been widely used in regression problems due to their simple implementation and good generalization performance. Among them, least squares support vector regression (LS-SVR) and extreme learning machine (ELM) are popular techniques. However, noise sensitivity is a major bottleneck for both. To address this issue, a generalized loss function, called the \(\ell _s\)-loss, is proposed in this paper. Building on this novel loss function, two kernel-based regressors are constructed by replacing the \(\ell _2\)-loss in LS-SVR and ELM with the proposed \(\ell _s\)-loss for better noise robustness. Important properties of the \(\ell _s\)-loss, including robustness, asymmetry, and asymptotic approximation behavior, are verified theoretically. Moreover, an iteratively reweighted least squares (IRLS) algorithm is used to optimize the proposed methods and to interpret them from a weighted-regression viewpoint. The convergence of the proposal is proved, and a detailed analysis of its robustness is given. Experiments on both artificial and benchmark datasets confirm the validity of the proposed methods.
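The abstract's IRLS idea can be sketched in a minimal form: solve a weighted regularized least squares problem repeatedly, with weights derived from the current residuals so that large residuals (likely outliers) are downweighted. Since the exact form of the paper's \(\ell _s\)-loss is not given in this abstract, the sketch below assumes a Welsch-type weight function \(w(r)=\exp(-r^2/s)\) as a stand-in; the kernel choice, parameter values, and function names are illustrative, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and Z
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def irls_kernel_regression(X, y, lam=1.0, gamma=0.5, s=1.0, iters=20):
    """Robustified kernel ridge regression via IRLS.

    Assumed weight function w(r) = exp(-r^2 / s) (Welsch-type stand-in);
    the paper's actual l_s-loss would induce its own weight function.
    """
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    # Plain l2 (ridge) solution as the starting point
    alpha = np.linalg.solve(K + lam * np.eye(n), y)
    for _ in range(iters):
        r = y - K @ alpha                 # current residuals
        w = np.exp(-r**2 / s)             # downweight large residuals
        # Weighted regularized normal equations: (W K + lam I) alpha = W y
        alpha = np.linalg.solve(w[:, None] * K + lam * np.eye(n), w * y)
    return alpha

# Usage: fit noisy sin data containing one gross outlier
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (40, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
y[0] += 5.0                               # gross outlier
alpha = irls_kernel_regression(X, y)
pred = rbf_kernel(X, X, 0.5) @ alpha
```

Because the outlier's weight decays toward zero across iterations, the final fit essentially ignores it, which is the "weighted viewpoint" the abstract refers to.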

Keywords: Robust regression, Support vector machine, Extreme learning machine, Iteratively reweighted least squares

DOI: https://doi.org/10.1007/s10115-021-01554-8