Two-stage optimal component analysis

Authors:

Highlights:

Abstract

Linear techniques are widely used to reduce the dimension of image representation spaces in applications such as image indexing and object recognition. Optimal Component Analysis (OCA) is a method that addresses the problem of learning an optimal linear representation for a particular classification task. The problem is formulated in the framework of optimization on a Grassmann manifold and treated with stochastic gradient methods intrinsic to the manifold. OCA has been successfully applied to image classification problems arising in a variety of contexts. However, as the search space is typically very high-dimensional, OCA optimization often requires a large number of iterations, each involving extensive computations that make the algorithm somewhat costly to implement. In this paper, we propose a two-stage method, which we refer to as two-stage OCA, that improves the search efficiency by orders of magnitude without compromising the quality of the estimation. In fact, extensive experiments using face and object classification datasets indicate that the proposed method often leads to more accurate classification than the original OCA since it is not as prone to over-fitting. Two-stage OCA also leads to substantial improvement in classification performance compared with other linear dimension reduction methods.
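The sketch below is only an illustration of the kind of search problem the abstract describes, not the authors' algorithm: the unknown is a d-dimensional linear subspace of the image-feature space (a point on a Grassmann manifold), scored by a classification-based criterion. Here a toy 1-nearest-neighbor accuracy stands in for OCA's performance measure, and a naive random search stands in for the intrinsic stochastic gradient updates; the data, dimensions, and function names are all assumptions made for the example.

```python
# Minimal sketch (assumed toy setup, not the paper's method): search over
# d-dimensional subspaces of R^n, scoring each by projected classification accuracy.
import numpy as np

def random_subspace(n, d, rng):
    """Return an n x d matrix with orthonormal columns, i.e. a basis for a
    point on the Grassmann manifold G(n, d)."""
    A = rng.standard_normal((n, d))
    Q, _ = np.linalg.qr(A)       # reduced QR gives orthonormal columns
    return Q

def nn_accuracy(U, X_train, y_train, X_test, y_test):
    """Project data onto span(U) and score with 1-nearest-neighbor accuracy,
    a simple stand-in for the classification criterion OCA optimizes."""
    Z_train, Z_test = X_train @ U, X_test @ U
    dists = np.linalg.norm(Z_test[:, None, :] - Z_train[None, :, :], axis=2)
    pred = y_train[np.argmin(dists, axis=1)]
    return np.mean(pred == y_test)

rng = np.random.default_rng(0)
n, d = 100, 5                                  # ambient and reduced dimensions (illustrative)
X_train = rng.standard_normal((60, n))
y_train = rng.integers(0, 2, 60)
X_test = rng.standard_normal((40, n))
y_test = rng.integers(0, 2, 40)

# Naive random search over subspaces. OCA instead follows a stochastic
# gradient flow intrinsic to the manifold, and the two-stage variant first
# reduces the search space before running that optimization.
best_U, best_acc = None, -np.inf
for _ in range(50):
    U = random_subspace(n, d, rng)
    acc = nn_accuracy(U, X_train, y_train, X_test, y_test)
    if acc > best_acc:
        best_U, best_acc = U, acc
print(f"best projected 1-NN accuracy over random subspaces: {best_acc:.3f}")
```

Because the criterion is evaluated over an entire subspace rather than a single direction, the search space grows quickly with n and d, which is the cost issue the two-stage approach in the paper is designed to mitigate.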

Keywords:

Article history: Received 26 September 2006, Accepted 30 April 2007, Available online 8 June 2007.

Paper link: https://doi.org/10.1016/j.cviu.2007.04.005