Training primal K-nearest neighbor based weighted twin support vector regression via unconstrained convex minimization

Author: Deepak Gupta

Abstract

Recently, Xu and Wang (Appl Intell 41(1):92–101, 2014) proposed a method called K-nearest neighbor based weighted twin support vector regression (KNNWTSVR) to improve prediction accuracy by using the samples' local information. In this paper, a new variant of this approach, K-nearest neighbor based weighted twin support vector regression formulated in the primal space as a pair of unconstrained minimization problems (KNNUPWTSVR), is proposed, which also reduces the effect of outliers. The solution of the proposed method is an approximate solution obtained in the primal space. It is well known that the approximate solution of the optimization problem in the primal is superior to that of its dual. The objective functions of the proposed KNNUPWTSVR are continuous and piecewise quadratic, and they are solved by computing the zeros of their gradients. However, since the objective functions contain the non-smooth 'plus' function, two approaches are suggested to solve the problems: (i) replacing the 'plus' function with a smooth approximation function, and (ii) using a generalized derivative approach. To check the effectiveness of the proposed method, computational results of KNNUPWTSVR are compared with those of support vector regression (SVR), twin SVR (TSVR) and ε-twin SVR (ε-TSVR) on a number of synthetic and real-world datasets. The proposed method achieves similar or better generalization performance than SVR, TSVR and ε-TSVR and requires less computational time, which clearly indicates its effectiveness and applicability.
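As an illustration of the first approach, a commonly used smooth approximation of the 'plus' function $(x)_+ = \max(x, 0)$ in smooth SVM-type formulations is the sigmoid-integral smoothing below; whether the paper adopts this exact smoothing function is an assumption here, given only the abstract:

$$p(x, \alpha) = x + \frac{1}{\alpha}\,\log\!\left(1 + e^{-\alpha x}\right), \qquad \alpha > 0,$$

which is twice differentiable and converges to $(x)_+$ as the smoothing parameter $\alpha \to \infty$, so the smoothed unconstrained objectives can then be minimized by Newton-type iterations that compute the zeros of their gradients.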

Keywords: Support vector regression, Twin support vector regression, K-nearest neighbor, Unconstrained minimization problems, Smooth approximation functions


Paper URL: https://doi.org/10.1007/s10489-017-0913-4