Linear support vector regression with linear constraints

Authors: Quentin Klopfenstein, Samuel Vaiter

Abstract

This paper studies the addition of linear constraints to Support Vector Regression (SVR) when the kernel is linear. Adding such constraints makes it possible to incorporate prior knowledge about the estimator, for example that it should be a non-negative vector, a probability vector, or that the fitted values should be monotone. We prove that the resulting optimization problem remains a semi-definite quadratic problem. We also propose a generalization of the Sequential Minimal Optimization (SMO) algorithm for solving the optimization problem with linear constraints and prove its convergence. We show that an efficient generalization of this iterative algorithm with closed-form updates can be used to obtain the solution of the underlying optimization problem. The practical performance of this estimator is then demonstrated on simulated and real datasets in different settings: non-negative regression, regression onto the simplex for biomedical data, and isotonic regression for weather forecasting. These experiments show the usefulness of this estimator in comparison with more classical approaches.
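As a rough illustration of the kind of problem the abstract describes, the sketch below formulates linear SVR with a linear (non-negativity) constraint on the coefficients as a convex quadratic program and solves it with a generic solver (cvxpy). This is only a minimal sketch of the primal problem: the paper itself works with the dual and a generalized SMO algorithm, and all names, data, and hyperparameter values here (X, y, C, epsilon) are illustrative assumptions rather than the authors' code.

```python
# Minimal sketch: linear SVR with a linear constraint (non-negative coefficients),
# written as a convex quadratic program and solved with a generic solver.
# Assumes numpy and cvxpy are installed; data and hyperparameters are synthetic.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, p = 50, 5
X = rng.standard_normal((n, p))
beta_true = np.abs(rng.standard_normal(p))          # ground-truth coefficients are non-negative
y = X @ beta_true + 0.1 * rng.standard_normal(n)

C, epsilon = 1.0, 0.1                                # SVR regularization and tube width
beta = cp.Variable(p)
b = cp.Variable()

# Primal linear SVR objective: ridge penalty + epsilon-insensitive loss.
residual = y - X @ beta - b
loss = cp.sum(cp.pos(cp.abs(residual) - epsilon))
objective = cp.Minimize(0.5 * cp.sum_squares(beta) + C * loss)

# Linear constraint encoding prior knowledge: the estimator is a positive vector.
problem = cp.Problem(objective, [beta >= 0])
problem.solve()
print("estimated coefficients:", np.round(beta.value, 3))
```

Other constraints mentioned in the abstract fit the same template: summing the coefficients to one on top of non-negativity gives regression onto the simplex, and ordering constraints between fitted values give an isotonic-style fit.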

Keywords: Support vector machine, Support vector regression, Sequential minimal optimization, Coordinate descent, Constrained linear regression

Review process:

Paper URL: https://doi.org/10.1007/s10994-021-06018-2