Joint low-rank representation and spectral regression for robust subspace learning

Authors:

Abstract

Subspace learning refers to a family of algorithms that project high-dimensional data onto a low-dimensional subspace so as to retain the desirable properties of the data while reducing its dimensionality. In graph-based subspace learning algorithms, the quality of the graph strongly affects the learning of the projection matrix. In particular, when the data are noisy or even grossly corrupted, there is no guarantee that the constructed graph accurately depicts the intrinsic structure of the data. Moreover, the widely used two-stage paradigm, in which the projection matrix is learned on a previously fixed graph, severs the connection between the two stages. In this paper, we propose a general framework that removes these disadvantages, inspired by low-rank matrix recovery and spectral regression-based subspace learning. Concretely, by jointly optimizing the objectives of low-rank representation and spectral regression, we simultaneously recover the data from noise, obtain the graph affinity matrix, and learn the projection matrix. The resulting joint low-rank and subspace learning (JLRSL) framework can be efficiently optimized by the augmented Lagrange multiplier method. We evaluate JLRSL through extensive experiments on representative benchmark data sets; the results show that low-rank learning greatly facilitates subspace learning and leads to robust feature extraction. Comparisons with state-of-the-art methods are also reported.
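Since only the abstract is reproduced here, the exact JLRSL objective and its ALM updates are not available; the following is a minimal, non-authoritative sketch of the two ingredients the abstract names. It assumes the standard low-rank representation model min_{Z,E} ||Z||_* + lam·||E||_{2,1} s.t. X = XZ + E, solved by inexact augmented Lagrange multipliers, followed by a spectral-regression step that builds the affinity matrix (|Z| + |Z|^T)/2 and fits the projection W by ridge regression. All function names and parameters (svt, l21_prox, lam, alpha, k) are hypothetical, not taken from the paper.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def l21_prox(M, tau):
    """Column-wise shrinkage: proximal operator of tau * ||.||_{2,1}."""
    norms = np.linalg.norm(M, axis=0, keepdims=True)
    return M * np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)

def lrr_ialm(X, lam=0.1, mu=1e-2, rho=1.5, max_iter=200, tol=1e-6):
    """Inexact ALM for  min ||Z||_* + lam ||E||_{2,1}  s.t.  X = X Z + E."""
    d, n = X.shape
    Z = np.zeros((n, n)); E = np.zeros((d, n))
    Y1 = np.zeros((d, n)); Y2 = np.zeros((n, n))   # Lagrange multipliers
    XtX, I = X.T @ X, np.eye(n)
    for _ in range(max_iter):
        # J-update: nuclear-norm proximal step (singular value thresholding)
        J = svt(Z + Y2 / mu, 1.0 / mu)
        # Z-update: closed-form least-squares solution
        Z = np.linalg.solve(XtX + I, XtX - X.T @ E + J + (X.T @ Y1 - Y2) / mu)
        # E-update: column-wise l2,1 shrinkage of the residual
        E = l21_prox(X - X @ Z + Y1 / mu, lam / mu)
        # multiplier and penalty updates
        r1, r2 = X - X @ Z - E, Z - J
        Y1 += mu * r1; Y2 += mu * r2
        mu = min(mu * rho, 1e6)
        if max(np.abs(r1).max(), np.abs(r2).max()) < tol:
            break
    return Z, E

def spectral_regression(X, Z, k=5, alpha=1e-2):
    """Affinity graph from the low-rank codes, then ridge regression for W."""
    A = 0.5 * (np.abs(Z) + np.abs(Z).T)            # symmetric affinity matrix
    L = np.diag(A.sum(axis=1)) - A                 # unnormalized graph Laplacian
    _, vecs = np.linalg.eigh(L)
    Y = vecs[:, 1:k + 1]                           # smoothest nontrivial embedding
    return np.linalg.solve(X @ X.T + alpha * np.eye(X.shape[0]), X @ Y)

# Toy usage: 20-dimensional data, 100 samples, 5 extracted features.
X = np.random.randn(20, 100)
Z, E = lrr_ialm(X)
W = spectral_regression(X, Z)
features = W.T @ X                                 # robust low-dimensional features
```

Note that the paper optimizes both objectives jointly, whereas this sketch alternates them as two separate stages; it is meant only to make the named components concrete.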

Keywords: Low-rank representation, Subspace learning, Spectral regression, Joint learning, Robustness

Article history: Received 22 October 2019; Revised 7 February 2020; Accepted 1 March 2020; Available online 3 March 2020; Version of Record 4 April 2020.

DOI: https://doi.org/10.1016/j.knosys.2020.105723