Optimal reduction of solutions for support vector machines

Authors:

Highlights:

Abstract

Although the support vector machine (SVM) is a universal learning machine, its computational cost in the test phase grows with the number of support vectors, which greatly limits its practical use. To address this problem, we propose an adaptive genetic algorithm that optimally reduces the solution of an SVM by selecting vectors from the trained support vector set, such that the selected vectors best approximate the original discriminant function. Our method can be applied to SVMs with any general kernel. The size of the reduced set can be chosen adaptively according to the requirements of the task, so the generalization/complexity trade-off can be controlled directly. The lower bound on the number of selected vectors required to recover the original discriminant function can also be determined.
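The following is a minimal illustrative sketch of the general reduced-set idea described above: a simple genetic algorithm searches for a subset of the trained support vectors whose decision function best approximates the full one. The dataset, RBF kernel, fitness measure, and GA parameters are assumptions for demonstration only and do not reproduce the paper's adaptive algorithm.

```python
# Sketch: GA-based selection of a reduced support vector set (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Train a full SVM with an RBF kernel on synthetic data.
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
svm = SVC(kernel="rbf", gamma="scale").fit(X, y)
SV, alpha, b = svm.support_vectors_, svm.dual_coef_.ravel(), svm.intercept_[0]
gamma = 1.0 / (X.shape[1] * X.var())   # scikit-learn's "scale" heuristic

def rbf(A, B):
    # Pairwise RBF kernel values between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K_full = rbf(X, SV)
f_full = K_full @ alpha + b            # full decision values on the data

def fitness(mask):
    # Negative mean squared error between reduced and full decision functions.
    if mask.sum() == 0:
        return -np.inf
    f_red = K_full[:, mask] @ alpha[mask] + b
    return -np.mean((f_red - f_full) ** 2)

# Simple GA over binary masks that mark which support vectors are kept.
n_sv, pop_size, n_gen, keep = len(SV), 30, 50, 10
pop = rng.random((pop_size, n_sv)) < 0.3      # start with ~30% of the SVs kept
for _ in range(n_gen):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-keep:]]             # selection
    children = []
    while len(children) < pop_size - keep:
        p1, p2 = parents[rng.integers(keep, size=2)]
        cut = rng.integers(1, n_sv)                        # one-point crossover
        child = np.concatenate([p1[:cut], p2[cut:]])
        flip = rng.random(n_sv) < 0.02                     # mutation
        children.append(np.where(flip, ~child, child))
    pop = np.vstack([parents, np.asarray(children)])

best = pop[np.argmax([fitness(m) for m in pop])]
print(f"kept {best.sum()} of {n_sv} support vectors, "
      f"approximation error {-fitness(best):.2e}")
```

In this sketch the trade-off between generalization and test-time complexity is exposed through the size of the selected subset: biasing the initial population (or penalizing large masks in the fitness) shrinks the reduced set at the cost of a larger approximation error.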

Keywords: Support vector machine, Vector correlation, Genetic algorithms, Optimal solution, Discriminant function, Pattern recognition

Review process: Available online 12 April 2009.

Article link: https://doi.org/10.1016/j.amc.2009.04.010