A new approach for training Lagrangian support vector regression

Authors: S. Balasundaram, Yogendra Meena

Abstract

In this paper, a novel root-finding problem is formulated for the Lagrangian support vector regression in 2-norm (LSVR), in which the number of unknowns equals the number of training examples. We further propose solving it by functional iterative and Newton methods, and prove their linear rate of convergence under sufficient conditions. Experiments are performed on a number of synthetic and real-world benchmark datasets, and the results are compared with support vector regression (SVR) and its variants, such as least squares SVR and LSVR. The proposed formulation, solved by the iterative methods, achieves generalization performance similar to SVR and its variants with improved or comparable learning speed, demonstrating its usefulness.

Keywords: Absolute value equation, Functional iterative method, Generalized Newton method, Smooth approximation, Support vector regression
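The keywords mention solving an absolute value equation (AVE) by a functional (fixed-point) iterative method. As a hedged illustration of that general idea, and not the paper's exact LSVR formulation, the sketch below solves a generic AVE of the form Ax − |x| = b by the iteration x_{k+1} = A⁻¹(b + |x_k|), which converges linearly when the spectral norm of A⁻¹ is below 1; the matrix sizes and tolerances here are illustrative assumptions.

```python
import numpy as np

def solve_ave(A, b, tol=1e-10, max_iter=500):
    """Fixed-point iteration for the absolute value equation A x - |x| = b.

    Illustrative sketch only: converges linearly when ||A^{-1}|| < 1,
    analogous to the linear-rate guarantees stated in the abstract.
    """
    A_inv = np.linalg.inv(A)
    x = np.zeros_like(b)
    for _ in range(max_iter):
        x_new = A_inv @ (b + np.abs(x))  # functional iterative step
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Small synthetic instance: A is made strongly diagonally dominant so that
# ||A^{-1}|| < 1 and the iteration contracts.
rng = np.random.default_rng(0)
n = 5
A = 3.0 * np.eye(n) + 0.1 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
x = solve_ave(A, b)
residual = np.linalg.norm(A @ x - np.abs(x) - b)
```

In the paper's setting, the unknown vector would play the role of the Lagrange multipliers, one per training example, as stated in the abstract.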


Paper URL: https://doi.org/10.1007/s10115-016-0928-x