SVM decision boundary based discriminative subspace induction

Abstract

We study the problem of linear dimension reduction for classification, with a focus on sufficient dimension reduction, i.e., finding subspaces without loss of discrimination power. First, we formulate the concept of a sufficient subspace for classification in terms parallel to those used for regression. Then we present a new method to estimate the smallest sufficient subspace, based on an improvement of decision boundary analysis (DBA). The main idea is to combine DBA with support vector machines (SVM) to overcome DBA's inherent difficulty in small sample size situations while keeping its estimation simplicity. The compact representation of the SVM decision boundary yields a significant gain in both speed and accuracy over previous DBA implementations. Alternatively, the technique can be viewed as a way to reduce the run-time complexity of SVM itself. Comparative experiments on one simulated and four real-world benchmark datasets highlight the superior performance of the proposed approach.
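
A rough sketch of the idea described in the abstract, not the paper's actual algorithm: DBA estimates a discriminative subspace from the normals of the decision boundary, and an SVM provides a compact kernel expansion of that boundary whose gradient can be evaluated in closed form. The sketch below assumes a binary RBF-kernel SVM fitted with scikit-learn, uses the support vectors as stand-ins for points near the boundary, and all function names (svm_boundary_normals, discriminative_subspace) and hyperparameter values are illustrative choices, not taken from the paper.

import numpy as np
from sklearn.svm import SVC


def svm_boundary_normals(clf, X, gamma):
    # Gradient of a binary RBF-kernel SVM decision function at the rows of X.
    # f(x) = sum_i a_i exp(-gamma ||s_i - x||^2) + b, hence
    # grad f(x) = -2 * gamma * sum_i a_i exp(-gamma ||s_i - x||^2) * (x - s_i).
    sv = clf.support_vectors_        # (n_sv, d)
    alpha = clf.dual_coef_.ravel()   # signed dual coefficients a_i
    grads = np.empty_like(X, dtype=float)
    for j, x in enumerate(X):
        diff = x - sv                                    # (n_sv, d)
        k = np.exp(-gamma * np.sum(diff ** 2, axis=1))   # kernel values
        grads[j] = -2.0 * gamma * (alpha * k) @ diff
    return grads


def discriminative_subspace(X, y, d, gamma=0.5, C=10.0):
    # DBA-style subspace estimate from SVM boundary normals (illustrative only).
    clf = SVC(kernel="rbf", gamma=gamma, C=C).fit(X, y)
    # Support vectors serve as proxies for points on or near the decision boundary.
    normals = svm_boundary_normals(clf, clf.support_vectors_, gamma)
    normals /= np.linalg.norm(normals, axis=1, keepdims=True) + 1e-12
    # Decision-boundary feature matrix: scatter of the unit normals.
    M = normals.T @ normals / len(normals)
    eigval, eigvec = np.linalg.eigh(M)
    # Columns span the estimated d-dimensional discriminative subspace.
    return eigvec[:, np.argsort(eigval)[::-1][:d]]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy 2-class problem in 5-D whose label depends only on the first coordinate.
    X = rng.normal(size=(300, 5))
    y = (X[:, 0] > 0).astype(int)
    B = discriminative_subspace(X, y, d=1)
    print("leading direction:", np.round(B[:, 0], 2))

On this toy problem the recovered direction should align (up to sign) with the first coordinate axis, the single direction that carries all discriminative information.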

Keywords: Dimensionality reduction, Linear dimension reduction, Sufficient dimension reduction, Intrinsic discriminative subspace (IDS), Decision boundary analysis (DBA), Support vector machines (SVM), Classification, Regression

Article history: Received 15 October 2003, Revised 24 January 2005, Accepted 24 January 2005, Available online 9 April 2005.

DOI: https://doi.org/10.1016/j.patcog.2005.01.016