Inverse matrix-free incremental proximal support vector machine

Authors:

Highlights:

Abstract:

Traditional Support Vector Machine (SVM) based learners are commonly regarded as strong classifiers for many learning tasks. Their efficiency on large-scale, high-dimensional data, however, has proved unsatisfactory. Consequently, many alternative SVM solutions exist for large-scale and/or high-dimensional data. Among them, the proximal support vector machine (PSVM) is a simple but effective SVM classifier. Its incremental version (ISVM) is also available for large-scale data. Nevertheless, the computational efficiency of ISVM on high-dimensional data still needs to be improved, mainly because it requires explicit matrix inversion to update the decision model. To solve this problem, we propose in this paper an inverse matrix-free incremental PSVM (IMISVM) with the following two characteristics. First, IMISVM avoids explicit matrix inversion and hence derives simple formulas for updating the model parameters. Second, IMISVM achieves a faster convergence speed than ISVM. Experimental results on synthetic and real-world data sets confirm that the proposed incremental classifier outperforms ISVM.
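To make the abstract's point concrete, the sketch below illustrates one standard way an incremental PSVM can update its decision model without redoing a full matrix inversion: maintaining the inverse of the PSVM system matrix with a Sherman–Morrison rank-one update as each sample arrives. This is a minimal illustration under assumed PSVM formulas (Fung–Mangasarian style) and is not a reproduction of the paper's actual IMISVM update rules; all class and function names are hypothetical.

```python
import numpy as np


def psvm_batch(A, d, nu=1.0):
    """Batch proximal SVM (assumed Fung-Mangasarian formulation).

    Solves (I/nu + E^T E) z = E^T d with E = [A, -e], z = [w; gamma].
    A : (m, n) data matrix, d : (m,) labels in {-1, +1}.
    """
    m, n = A.shape
    E = np.hstack([A, -np.ones((m, 1))])      # augmented data [A, -e]
    M = np.eye(n + 1) / nu + E.T @ E          # (n+1) x (n+1) system matrix
    v = E.T @ d                               # right-hand side
    return np.linalg.solve(M, v)              # solve, rather than form M^{-1}


class IncrementalPSVMSketch:
    """Hypothetical incremental PSVM that keeps a running inverse of the
    system matrix via the Sherman-Morrison identity, so no explicit
    matrix inversion is repeated when a new sample arrives."""

    def __init__(self, n_features, nu=1.0):
        self.P = nu * np.eye(n_features + 1)  # P = (I/nu + E^T E)^{-1}, initially nu*I
        self.v = np.zeros(n_features + 1)     # accumulated E^T d
        self.z = np.zeros(n_features + 1)     # current solution [w; gamma]

    def partial_fit(self, x, y):
        """Update with one sample x (n_features,) and label y in {-1, +1}."""
        u = np.append(x, -1.0)                # augmented sample [x, -1]
        Pu = self.P @ u
        # Sherman-Morrison rank-one update of the stored inverse:
        self.P -= np.outer(Pu, Pu) / (1.0 + u @ Pu)
        self.v += y * u
        self.z = self.P @ self.v              # refreshed decision model
        return self

    def decision_function(self, X):
        w, gamma = self.z[:-1], self.z[-1]
        return X @ w - gamma
```

Feeding the samples one by one through `partial_fit` yields the same `[w; gamma]` as `psvm_batch` on the full data, which is the behaviour an incremental PSVM is expected to preserve while avoiding a fresh inversion at every update.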

Keywords: Incremental learning, Incremental proximal support vector machine, High dimensionality, Large scale

Article history: Received 16 February 2011, Revised 8 February 2012, Accepted 14 February 2012, Available online 21 February 2012.

DOI: https://doi.org/10.1016/j.dss.2012.02.007