An efficient implicit regularized Lagrangian twin support vector regression

Authors: M. Tanveer, K. Shubham, M. Aldhaifallah, K. S. Nisar

Abstract

Twin support vector regression (TSVR) and Lagrangian TSVR (LTSVR) implement only the empirical risk minimization principle. Moreover, the matrices in their formulations are only positive semi-definite, so they may be singular. To overcome these problems, we propose an efficient implicit Lagrangian formulation for the dual regularized twin support vector regression, called IRLTSVR for short. By introducing a regularization term into each objective function, the optimization problems in our IRLTSVR become positive definite and implement the structural risk minimization principle. Moreover, the 1-norm of the vector of slack variables is replaced with the 2-norm to make the objective functions strongly convex. Our IRLTSVR solves two systems of linear equations instead of the two quadratic programming problems (QPPs) in TSVR and the single large QPP in SVR, which makes its learning speed faster than that of TSVR and SVR. In particular, we compare three implementations of IRLTSVR with existing approaches. Computational results on several synthetic and real-world benchmark datasets clearly indicate the effectiveness and applicability of IRLTSVR in comparison with SVR, TSVR and LTSVR.
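To illustrate the key idea in the abstract, the following is a minimal sketch (not the authors' actual algorithm) of why adding a regularization term `C * I` to a positive semi-definite matrix `G.T @ G` yields a positive definite system that can be solved directly as a linear system rather than as a QPP. The matrix names `G`, `H` and the parameter `C` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))        # hypothetical training inputs
G = np.hstack([A, np.ones((50, 1))])    # augmented with a bias column
f = rng.standard_normal(50)             # hypothetical regression targets
C = 0.1                                 # assumed regularization parameter

# G.T @ G alone is only positive semi-definite; adding C * I with C > 0
# shifts every eigenvalue up by C, making H strictly positive definite.
H = G.T @ G + C * np.eye(G.shape[1])

# A unique solution now exists via one system of linear equations,
# avoiding the quadratic programming problem entirely.
w = np.linalg.solve(H, G.T @ f)
```

The same mechanism is applied twice in the paper's setting, once per twin regressor, which is why the method reduces to solving two linear systems.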

Keywords: Machine learning, Lagrangian support vector machines, Twin support vector regression, Implicit method, Smoothing technique


Paper URL: https://doi.org/10.1007/s10489-015-0728-0