Self-centralized jointly sparse maximum margin criterion for robust dimensionality reduction

Authors:

Highlights:

• To alleviate the deviation caused by outliers, we reformulate the weighted MMC with the L1,2-norm and adaptively update the trusted global and intra-class centroids during the iterative solving process.

• The L2,1-norm sparsity is introduced into the newly formulated MMC for joint feature selection and sparse subspace learning, which significantly reduces model complexity and improves generalization.

• A simple and efficient iterative algorithm is derived, and a novel theoretical proof of its convergence is given (see the sketch after these highlights). In addition, comparative experiments demonstrate the effectiveness of the proposed method.
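The highlights do not spell out the formulation, so the following is only a minimal, hypothetical NumPy sketch of an iteratively reweighted scheme of this general kind: trusted (weighted) centroids that down-weight outliers, an MMC-style trace criterion with an L2,1-norm row-sparsity surrogate, and alternating updates. The function names, the specific sample-weighting rule, and the parameter gamma are illustrative assumptions, not the paper's exact method or notation.

```python
# Illustrative sketch only -- NOT the authors' exact algorithm.
import numpy as np

def l21_reweight(W, eps=1e-8):
    """Diagonal surrogate matrix for the L2,1-norm of W: D = diag(1 / (2 * ||w_i||_2))."""
    row_norms = np.sqrt((W ** 2).sum(axis=1)) + eps
    return np.diag(1.0 / (2.0 * row_norms))

def robust_mmc_sketch(X, y, dim, gamma=0.1, n_iter=30):
    """X: (n_samples, n_features), y: integer class labels.
    Returns a (n_features, dim) projection with row-sparse structure."""
    n, d = X.shape
    classes = np.unique(y)
    W = np.linalg.qr(np.random.randn(d, dim))[0]   # random orthonormal init
    sample_w = np.ones(n)                          # per-sample trust weights
    for _ in range(n_iter):
        # 1) Trusted centroids: weighted means that down-weight suspected outliers.
        mu = np.average(X, axis=0, weights=sample_w)
        Sb = np.zeros((d, d)); Sw = np.zeros((d, d))
        for c in classes:
            idx = (y == c)
            mu_c = np.average(X[idx], axis=0, weights=sample_w[idx])
            diff = (mu_c - mu)[:, None]
            Sb += sample_w[idx].sum() * diff @ diff.T
            Xc = X[idx] - mu_c
            Sw += (Xc * sample_w[idx][:, None]).T @ Xc
        # 2) MMC-style step with L2,1 penalty:
        #    max tr(W^T (Sb - Sw) W) - gamma * ||W||_2,1, approximated via the
        #    reweighted surrogate tr(W^T D W), solved by an eigen-decomposition.
        D = l21_reweight(W)
        eigvals, eigvecs = np.linalg.eigh(Sb - Sw - gamma * D)
        W = eigvecs[:, np.argsort(eigvals)[::-1][:dim]]
        # 3) Update trust weights from projected residuals to each class centroid
        #    (one simple choice; an L1,2-based weighting would differ in detail).
        res = np.empty(n)
        for c in classes:
            idx = (y == c)
            mu_c = np.average(X[idx], axis=0, weights=sample_w[idx])
            res[idx] = np.linalg.norm((X[idx] - mu_c) @ W, axis=1)
        sample_w = 1.0 / (res + 1e-8)
        sample_w /= sample_w.mean()
    return W
```

The sketch alternates between an eigen-step for the projection and a reweighting step for the centroids and sparsity surrogate; monotone convergence arguments for this family of reweighted algorithms are standard, though the paper's own proof applies to its specific formulation.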

Keywords: Maximum margin criterion, Robustness, Adaptive centroid, L2,1- and L1,2-norm sparsity, Dimensionality reduction

Article history: Received 17 January 2020, Revised 4 June 2020, Accepted 28 July 2020, Available online 3 August 2020, Version of Record 6 August 2020.

Article link: https://doi.org/10.1016/j.knosys.2020.106343