Least squares large margin distribution machine for regression

Authors: Umesh Gupta, Deepak Gupta

Abstract

Better prediction ability is the main objective of any regression-based model. The Large margin Distribution Machine for Regression (LDMR) is an efficient approach that minimizes two loss functions, the ε-insensitive loss and the quadratic loss, to diminish the effect of outliers. However, it still has a significant drawback: high computational complexity. To improve the generalization of the regression model at a lower computational cost, we propose an enhanced form of LDMR, named Least Squares Large margin Distribution Machine-based Regression (LS-LDMR), obtained by transforming the inequality constraints into equality constraints. The solution is then obtained from a system of linear equations, which requires only a matrix inversion. Hence, there is no need to solve a large quadratic programming problem, unlike other regression-based algorithms such as SVR, Twin SVR, and LDMR. Numerical experiments have been performed on benchmark real-life datasets as well as synthetically generated datasets, using both linear and Gaussian kernels. The presented LS-LDMR is compared against standard SVR, Twin SVR, primal least squares Twin SVR (PLSTSVR), ε-Huber SVR (ε-HSVR), ε-support vector quantile regression (ε-SVQR), minimum deviation regression (MDR), and LDMR, and the results show the effectiveness and usability of LS-LDMR. The approach is also statistically validated and verified in terms of various metrics.
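To illustrate the computational point made in the abstract, the following is a minimal sketch of a least-squares kernel regression trained by a single linear solve (in the spirit of LS-SVR). It is an assumption-laden toy, not the paper's LS-LDMR formulation: the margin-mean and margin-variance terms of the actual objective are omitted, and the kernel, regularization parameter `C`, and bandwidth `gamma` are arbitrary choices. It only shows how equality constraints reduce training to solving `(K + I/C) α = y` instead of a quadratic program.

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between the rows of A and B.
    d2 = (np.sum(A ** 2, axis=1)[:, None]
          + np.sum(B ** 2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-gamma * d2)

def fit(X, y, C=10.0, gamma=1.0):
    # With equality constraints, training collapses to one linear system
    # (K + I/C) alpha = y -- a matrix inversion, not a QP solve.
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(K + np.eye(len(y)) / C, y)

def predict(X_train, alpha, X_test, gamma=1.0):
    # Prediction is a kernel expansion over the training points.
    return gaussian_kernel(X_test, X_train, gamma) @ alpha

# Toy usage: fit a noisy sine curve and report the training MSE.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X).ravel() + 0.05 * rng.normal(size=80)
alpha = fit(X, y)
y_hat = predict(X, alpha, X)
mse = float(np.mean((y - y_hat) ** 2))
print(mse)
```

The contrast with SVR-style methods is that no ε-insensitive tube and no inequality constraints remain, so no quadratic programming solver is invoked; the cost is dominated by one dense linear solve.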

Keywords: Support vector regression, Twin support vector regression, Least squares method, Large margin distribution machine

Review process:

Paper URL: https://doi.org/10.1007/s10489-020-02166-5