Joint Learning of Unsupervised Dimensionality Reduction and Gaussian Mixture Model

Authors: Xi Yang, Kaizhu Huang, John Yannis Goulermas, Rui Zhang

Abstract

Dimensionality reduction (DR) has been a central research topic in information theory, pattern recognition, and machine learning. The performance of many learning models relies significantly on dimensionality reduction: successful DR can greatly improve various clustering and classification approaches, while inappropriate DR may degrade a system. When applied to high-dimensional data, existing approaches often reduce the dimensionality first and then feed the reduced features into another available model, e.g., a Gaussian mixture model (GMM). Such independent learning can, however, significantly limit performance, since the optimal subspace given by a particular DR approach may not be appropriate for the subsequent model. In this paper, we investigate how unsupervised dimensionality reduction can be performed jointly with a GMM, and whether such joint learning improves on the traditional unsupervised method. In particular, we employ a mixture of factor analyzers under the assumption that a common factor loading is shared by all components. Based on this model, we present an EM algorithm that converges to a locally optimal solution; this setting optimizes the dimensionality reduction together with the parameters of the GMM. We describe the framework, detail the algorithm, and conduct a series of experiments to validate the effectiveness of the proposed approach. Specifically, we compare the proposed joint learning approach with two competitive algorithms on one synthetic and six real data sets. Experimental results show that joint learning significantly outperforms the comparison methods in terms of three criteria.
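The two-stage pipeline the abstract argues against (reduce dimensionality first, then fit a GMM on the reduced features) can be sketched as follows. This is a minimal illustration only, not the paper's joint method: PCA stands in for a generic unsupervised DR step, and the synthetic dataset, component counts, and scikit-learn usage are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

# Hypothetical high-dimensional data with 3 latent clusters.
X, y = make_blobs(n_samples=300, n_features=50, centers=3, random_state=0)

# Stage 1: unsupervised DR, learned independently of the downstream model.
Z = PCA(n_components=2, random_state=0).fit_transform(X)

# Stage 2: fit a GMM on the reduced features.
gmm = GaussianMixture(n_components=3, random_state=0).fit(Z)
labels = gmm.predict(Z)
```

Because the PCA subspace is chosen without reference to the GMM objective, it may not be the subspace in which the mixture components separate best; the paper's contribution is to optimize the loading matrix and the GMM parameters jointly within one EM procedure instead.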

Keywords: Dimensionality reduction, Gaussian mixture model, Factor analyzers, Unsupervised learning


Paper link: https://doi.org/10.1007/s11063-016-9508-z