An Efficient SMO Algorithm for Solving Non-smooth Problem Arising in \(\varepsilon \)-Insensitive Support Vector Regression

Author: Aykut Kocaoğlu

Abstract

Classical support vector regression (C-SVR) is a powerful function approximation method that is robust against noise and generalizes well, since it is formulated as a regularized error function exploiting the \(\varepsilon \)-insensitivity property. To exploit the kernel trick, C-SVR is generally solved through its Lagrangian dual problem. This paper proposes an efficient sequential minimal optimization (SMO) algorithm for solving a convex non-smooth dual optimization problem obtained by reformulating the dual problem of C-SVR with the \(l_2\) error loss function, a formulation equivalent to the \(\varepsilon \)-insensitive version of least squares support vector regression (LSSVR). The algorithm uses a novel, easy-to-compute working set selection (WSS) rule based on minimizing an upper bound on the difference between consecutive loss function values. The asymptotic convergence of the proposed SMO algorithm to the optimum is also proved. The proposed SMO algorithm for the non-smooth problem subsumes the SMO algorithms for both LSSVR and C-SVR; in particular, it reduces to the SMO algorithm with second-order WSS for LSSVR when \(\varepsilon =0\). Because it operates on half as many optimization variables as C-SVR, the proposed algorithm requires fewer kernel matrix evaluations than the standard SMO algorithm developed for C-SVR and increases the probability that the required matrix entries have already been computed and cached. Consequently, it achieves better training times than the standard SMO algorithm for C-SVR, especially when caching is used. Moreover, the superiority of the proposed WSS over its first-order counterpart for solving the non-smooth optimization problem is demonstrated.
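As a concrete illustration of the kind of solver the abstract describes, the sketch below implements a minimal SMO-style method for one common form of such a non-smooth dual: \(\min_\beta \tfrac{1}{2}\beta^\top \tilde K \beta - y^\top \beta + \varepsilon \|\beta\|_1\) subject to \(\sum_i \beta_i = 0\), with \(\tilde K = K + I/(2C)\). This formulation, the function names (`smo_eps_lssvr`, `rbf_kernel`), and all parameters are assumptions for illustration only; in particular, the pair selection here is a plain first-order maximal-violating-pair rule on the subdifferential, not the paper's novel upper-bound-based WSS. Each pairwise update \(\beta_i \mathrel{+}= t\), \(\beta_j \mathrel{-}= t\) preserves the equality constraint, and the resulting one-dimensional piecewise-quadratic subproblem, where the \(\varepsilon \|\beta\|_1\) non-smoothness enters through its breakpoints, is solved exactly.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    """Gaussian (RBF) kernel matrix for the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

def _solve_pair_1d(a, d, bi, bj, eps):
    """Exactly minimize the 1-D piecewise quadratic
    phi(t) = 0.5*a*t^2 + d*t + eps*(|bi + t| + |bj - t|),
    whose breakpoints are t = -bi and t = bj (assumes a > 0)."""
    phi = lambda t: 0.5 * a * t * t + d * t + eps * (abs(bi + t) + abs(bj - t))
    b1, b2 = sorted((-bi, bj))
    cands = [b1, b2]
    for lo, hi in ((-np.inf, b1), (b1, b2), (b2, np.inf)):
        if hi <= lo:
            continue
        # pick an interior point to read off the signs of (bi + t) and (bj - t)
        mid = hi - 1.0 if np.isinf(lo) else (lo + 1.0 if np.isinf(hi) else 0.5 * (lo + hi))
        s1 = 1.0 if bi + mid > 0 else -1.0
        s2 = 1.0 if bj - mid > 0 else -1.0
        t = -(d + eps * (s1 - s2)) / a      # stationary point of the local quadratic
        cands.append(min(max(t, lo), hi))   # clip it back into this interval
    return min(cands, key=phi)

def smo_eps_lssvr(K, y, C=10.0, eps=0.1, tol=1e-6, max_iter=100000):
    """SMO-style solver for the assumed non-smooth dual
        min_beta 0.5 * beta^T Ktil beta - y^T beta + eps * ||beta||_1
        s.t.     sum(beta) = 0,    with Ktil = K + I/(2C).
    Pair selection is a first-order maximal-violating-pair rule,
    NOT the paper's upper-bound-based WSS."""
    n = len(y)
    Ktil = K + np.eye(n) / (2.0 * C)
    beta = np.zeros(n)
    grad = -y.astype(float)                  # gradient of the smooth part: Ktil @ beta - y
    for _ in range(max_iter):
        # element-wise bounds of the subdifferential grad_i + eps * d|beta_i|
        lower = grad + np.where(beta > 0, eps, -eps)
        upper = grad + np.where(beta < 0, -eps, eps)
        i, j = int(np.argmax(lower)), int(np.argmin(upper))
        if lower[i] - upper[j] < tol:        # KKT: some -b lies in every interval
            break
        a = Ktil[i, i] + Ktil[j, j] - 2.0 * Ktil[i, j]
        t = _solve_pair_1d(a, grad[i] - grad[j], beta[i], beta[j], eps)
        if t == 0.0:
            break
        beta[i] += t
        beta[j] -= t
        grad += t * (Ktil[:, i] - Ktil[:, j])
    lower = grad + np.where(beta > 0, eps, -eps)
    upper = grad + np.where(beta < 0, -eps, eps)
    b = -0.5 * (lower.max() + upper.min())   # bias estimated from the optimality interval
    return beta, b

if __name__ == "__main__":
    # toy check: fit a noisy sine wave
    rng = np.random.default_rng(0)
    X = np.linspace(-3.0, 3.0, 60)[:, None]
    y = np.sin(X).ravel() + 0.05 * rng.standard_normal(60)
    K = rbf_kernel(X)
    beta, b = smo_eps_lssvr(K, y, C=10.0, eps=0.1)
    print("train RMSE:", np.sqrt(np.mean((K @ beta + b - y) ** 2)))
```

Note the caching advantage mentioned in the abstract: each iteration touches only two columns of the kernel matrix, so with half as many variables as the standard C-SVR dual, the chance that a needed column is already cached grows accordingly.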

Keywords: Sequential minimal optimization, Support vector regression, Non-smooth optimization, Working set selection, Least squares support vector regression

DOI: https://doi.org/10.1007/s11063-018-09975-3