Minimum deviation distribution machine for large scale regression

Authors:

Highlights:

Abstract

In this paper, by introducing the statistics of the training data into support vector regression (SVR), we propose a minimum deviation distribution regression (MDR). Rather than minimizing the structural risk alone, MDR also minimizes both the regression deviation mean and the regression deviation variance, which enables it to cope with differing distributions of boundary data and noise. Minimizing these first- and second-order statistics leads to a strongly convex quadratic programming problem (QPP). An efficient dual coordinate descent algorithm is adopted for small-sample problems, and an averaged stochastic gradient algorithm for large-scale ones. Both theoretical analysis and experimental results illustrate the efficiency and effectiveness of the proposed method.
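For intuition, the following is a minimal Python sketch of an averaged stochastic gradient solver for a simplified MDR-style objective. The concrete objective used here (ridge regularizer plus a penalty on the mean squared deviation and on the deviation variance, with deviations d_i = y_i - w·x_i estimated per mini-batch), as well as the function and parameter names (`averaged_sgd_mdr`, `lam_mean`, `lam_var`), are assumptions for illustration only and are not taken from the paper.

```python
import numpy as np

def averaged_sgd_mdr(X, y, lam_mean=1.0, lam_var=1.0, lr=0.01,
                     epochs=10, batch_size=32, seed=0):
    """Averaged SGD for a simplified MDR-style objective (illustrative only):
        (1/2)||w||^2 + lam_mean * mean(d_i^2) + lam_var * var(d_i),
    where d_i = y_i - w . x_i is the regression deviation of sample i.
    The deviation mean/variance terms are estimated on each mini-batch.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    w = np.zeros(p)
    w_avg = np.zeros(p)   # running average of iterates (the "averaged" part)
    t = 0
    for _ in range(epochs):
        for idx in np.array_split(rng.permutation(n), max(1, n // batch_size)):
            Xb, yb = X[idx], y[idx]
            d = yb - Xb @ w                             # deviations on the mini-batch
            m = len(idx)
            g_mean = -(2.0 / m) * Xb.T @ d              # grad of mean(d_i^2)
            g_var = -(2.0 / m) * Xb.T @ (d - d.mean())  # grad of var(d_i)
            grad = w + lam_mean * g_mean + lam_var * g_var
            w -= lr * grad
            t += 1
            w_avg += (w - w_avg) / t                    # incremental Polyak averaging
    return w_avg

# Toy usage: fit a noisy linear relation y = 2*x1 - x2 + noise.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(1000, 2))
    y = X @ np.array([2.0, -1.0]) + 0.1 * rng.normal(size=1000)
    print(averaged_sgd_mdr(X, y, lam_mean=10.0, lam_var=1.0))
```

Returning the averaged iterate rather than the last one is the standard averaged-SGD device for strongly convex objectives; the ridge term above is what makes this toy objective strongly convex.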

Keywords: Regression, Support vector machine, Minimum deviation distribution machine, Dual coordinate descent algorithm, Stochastic gradient algorithm

Article history: Received 23 March 2017, Revised 26 January 2018, Accepted 1 February 2018, Available online 9 February 2018, Version of Record 28 February 2018.

DOI: https://doi.org/10.1016/j.knosys.2018.02.002