Infinite max-margin factor analysis via data augmentation

Authors:

Highlights:

• By jointly learning FA and SVM, the proposed MMFA obtains a discriminative subspace.

• MMFA is extended to iMMFA by clustering the data in the subspace with a DPM.

• Thanks to the joint learning framework, the models achieve good prediction performance.

• With data-description ability, the proposed models can reject outlier samples.

• Within the Bayesian framework, parameters are inferred efficiently by a Gibbs sampler.
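The paper's joint Bayesian inference is not reproduced here; as a minimal sketch of the underlying subspace-plus-max-margin idea, the code below uses a PCA-style factor-loading estimate followed by a hinge-loss (SVM-style) classifier in the latent space. The two-stage simplification and all variable names are assumptions for illustration, not the paper's actual data-augmentation Gibbs sampler.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: two classes generated from a shared low-dimensional subspace.
d, k, n = 20, 2, 200                     # observed dim, latent dim, samples per class
W = rng.normal(size=(d, k))              # true factor loading matrix (hypothetical)
z0 = rng.normal(loc=-2.0, size=(n, k))   # latent factors, class -1
z1 = rng.normal(loc=+2.0, size=(n, k))   # latent factors, class +1
X = np.vstack([z0, z1]) @ W.T + 0.1 * rng.normal(size=(2 * n, d))
y = np.concatenate([-np.ones(n), np.ones(n)])

# Stage 1: estimate the factor subspace via SVD (a crude stand-in for
# the paper's Bayesian FA inferred by Gibbs sampling).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
W_hat = Vt[:k].T                         # estimated loadings (d x k)
Z = Xc @ W_hat                           # latent representations (2n x k)

# Stage 2: linear max-margin classifier in the subspace, trained by
# subgradient descent on the regularized hinge loss (the SVM component).
w, b, lam, lr = np.zeros(k), 0.0, 1e-3, 0.1
for _ in range(500):
    margins = y * (Z @ w + b)
    mask = margins < 1                   # samples violating the margin
    if not mask.any():
        break                            # all points outside the margin
    grad_w = lam * w - (y[mask, None] * Z[mask]).mean(axis=0)
    grad_b = -y[mask].mean()
    w -= lr * grad_w
    b -= lr * grad_b

acc = np.mean(np.sign(Z @ w + b) == y)
print(f"subspace max-margin training accuracy: {acc:.2f}")
```

In the paper the two stages are learned jointly so that the SVM margin shapes the subspace itself; this sketch only shows the decoupled baseline the highlights argue against.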

Keywords: Latent variable support vector machine; Factor analysis; Dirichlet process mixture; Classification and rejection performance

Article history: Received 20 September 2014; Revised 2 September 2015; Accepted 29 October 2015; Available online 10 November 2015; Version of Record 24 December 2015.

DOI: https://doi.org/10.1016/j.patcog.2015.10.020