Quadratic optimization fine tuning for the Support Vector Machines learning phase

Authors:

Highlights:

Abstract

This work presents a comparative analysis of specific, rather than general, mathematical programming techniques for implementing the quadratic optimization problem (QP) that underlies the Support Vector Machine (SVM) learning process. Building on the Karush–Kuhn–Tucker (KKT) optimality conditions, we describe implementation strategies for the SVM-QP following three classical approaches: (i) active set methods, in both the primal and the dual space, (ii) interior point methods, and (iii) linearization strategies. We also present a general extension to large-scale applications that decomposes the QP problem into smaller subproblems while preserving the exact-solution approach. In addition, we propose a set of heuristics for initializing the decomposition strategy that improve on random selection. We compare the performance of the optimization strategies on several well-known benchmark databases.
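For context, the SVM-QP referred to in the abstract is, in its standard soft-margin dual form, the following problem (a conventional textbook formulation with multipliers α_i, box constraint C, labels y_i and kernel K; this notation is assumed here, not taken from the paper):

\[
\max_{\alpha}\ \sum_{i=1}^{n}\alpha_i-\frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i\alpha_j\, y_i y_j\, K(x_i,x_j)
\quad\text{s.t.}\quad \sum_{i=1}^{n}\alpha_i y_i=0,\qquad 0\le\alpha_i\le C .
\]

A feasible point is optimal exactly when the KKT conditions hold: writing \(f(x)=\sum_{j}\alpha_j y_j K(x_j,x)+b\),

\[
\alpha_i=0 \;\Rightarrow\; y_i f(x_i)\ge 1,\qquad
0<\alpha_i<C \;\Rightarrow\; y_i f(x_i)=1,\qquad
\alpha_i=C \;\Rightarrow\; y_i f(x_i)\le 1 .
\]

In general, decomposition methods of the kind mentioned in the abstract optimize over a small working set of multipliers at a time while the remaining α_i stay fixed, using violations of these KKT conditions to select the working set and to decide when to stop.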

Keywords: Support vector machines, Quadratic optimization, Decomposition, Initialization strategies

Article history: Available online 22 August 2013.

DOI: https://doi.org/10.1016/j.eswa.2013.08.019