Feature space locality constraint for kernel based nonlinear discriminant analysis

Abstract:

Subspace learning is an important approach in pattern recognition. Nonlinear discriminant analysis (NDA), with its capability to describe the nonlinear manifold structure of samples, is considered more powerful for classification tasks in image-related problems. Kernel-based NDA involves three spaces: the original data space, the implicitly mapped high-dimensional feature space, and the target low-dimensional subspace. Existing methods mainly exploit information in the original data space to find the most discriminant low-dimensional subspace. The implicit high-dimensional feature space connects the original space and the target subspace to realize nonlinear dimension reduction, but the geometric structure of samples in this feature space is left unused. In this work, we exploit this information: the locality of samples in the feature space is modeled and integrated into traditional kernel-based NDA methods. In this way, the sample distributions in both the original data space and the mapped high-dimensional feature space are modeled, and the additional information is expected to improve the discriminative ability of the subspace. Two algorithms, named FSLC-KDA and FSLC-KSR, are presented. Extensive experiments on the ORL, Extended-YaleB, PIE, Multi-PIE and FRGC databases validate the efficacy of the proposed method.
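The key technical point in the abstract is that sample locality can be measured in the implicit feature space without ever computing the mapping explicitly, because the kernel trick gives feature-space distances directly: ||φ(x_i) − φ(x_j)||² = K_ii + K_jj − 2K_ij. The sketch below (an illustrative assumption, not the paper's exact FSLC-KDA/FSLC-KSR formulation; the function names, the RBF kernel choice, and the heat-kernel neighbor weighting are all mine) shows how such a feature-space locality affinity matrix could be built from a Gram matrix:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.clip(d2, 0.0, None))

def feature_space_locality(K, n_neighbors=3):
    """Affinity matrix built from distances in the *implicit* feature space,
    recovered via the kernel trick:
        ||phi(x_i) - phi(x_j)||^2 = K_ii + K_jj - 2 K_ij
    Nearest neighbors in feature space get heat-kernel weights; all other
    entries stay zero. (Illustrative construction, not the paper's exact one.)
    """
    diag = np.diag(K)
    d2 = np.clip(diag[:, None] + diag[None, :] - 2.0 * K, 0.0, None)
    n = K.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        # skip index 0 of the sort order: that is sample i itself (distance 0)
        nbrs = np.argsort(d2[i])[1:n_neighbors + 1]
        W[i, nbrs] = np.exp(-d2[i, nbrs])
    return np.maximum(W, W.T)  # symmetrize the neighbor relation
```

A matrix like `W` could then serve as the locality-constraint term regularizing the kernel discriminant objective, so that samples close in feature space stay close in the learned subspace.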

Keywords: Locality constraint, Feature space, Nonlinear discriminant analysis, Face recognition

Article history: Received 16 May 2011, Revised 26 October 2011, Accepted 12 January 2012, Available online 21 January 2012.

DOI: https://doi.org/10.1016/j.patcog.2012.01.012