Robust high dimensional expectation maximization algorithm via trimmed hard thresholding

Authors: Di Wang, Xiangyu Guo, Shi Li, Jinhui Xu

Abstract

In this paper, we study the problem of estimating latent variable models from arbitrarily corrupted samples in high dimensional space (i.e., \(d\gg n\)), where the underlying parameter is assumed to be sparse. Specifically, we propose a method called Trimmed (Gradient) Expectation Maximization, which adds a gradient-trimming step to the Expectation step (E-step) and a hard thresholding step to the Maximization step (M-step). We show that under some mild assumptions and with an appropriate initialization, the algorithm is robust to the corruption and converges geometrically to a (near-)optimal statistical rate when the fraction of corrupted samples \(\epsilon\) is bounded by \(\tilde{O}\left(\frac{1}{\sqrt{n}}\right)\). Moreover, we apply our general framework to three canonical models: mixture of Gaussians, mixture of regressions, and linear regression with missing covariates. Our theory is supported by thorough numerical results.
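As a rough illustration of the update described in the abstract, the following is a minimal NumPy sketch for the symmetric two-component Gaussian mixture case: the E-step forms per-sample gradients of the Q-function and trims them coordinate-wise, and the M-step takes a gradient step followed by hard thresholding to the target sparsity. The function names, step size, gradient formula, and toy setup are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def trimmed_mean(G, eps):
    """Coordinate-wise trimmed mean of per-sample gradients G (n x d):
    for each coordinate, drop the largest and smallest eps-fraction of
    values, then average the remainder."""
    n, _ = G.shape
    k = int(np.ceil(eps * n))
    G_sorted = np.sort(G, axis=0)                 # sort each coordinate
    kept = G_sorted[k:n - k] if n - 2 * k > 0 else G_sorted
    return kept.mean(axis=0)

def hard_threshold(beta, s):
    """Keep only the s largest-magnitude entries of beta, zero the rest."""
    out = np.zeros_like(beta)
    idx = np.argsort(np.abs(beta))[-s:]
    out[idx] = beta[idx]
    return out

def trimmed_gradient_em(Y, beta0, s, eps, sigma=1.0, eta=0.5, n_iter=50):
    """Illustrative trimmed-gradient EM for a symmetric two-component
    Gaussian mixture y = z * beta + noise, z in {-1, +1}.
    E-step: per-sample gradients of the Q-function, trimmed coordinate-wise.
    M-step: gradient step followed by hard thresholding to sparsity s."""
    beta = beta0.copy()
    for _ in range(n_iter):
        w = np.tanh(Y @ beta / sigma ** 2)        # E-step responsibilities
        G = w[:, None] * Y - beta[None, :]        # per-sample gradients of Q
        g = trimmed_mean(G, eps)                  # robust gradient estimate
        beta = hard_threshold(beta + eta * g, s)  # thresholded M-step
    return beta

# Toy run: sparse beta*, a small fraction of grossly corrupted samples,
# and an initialization near beta* (the theory assumes a suitable start).
rng = np.random.default_rng(0)
n, d, s = 200, 500, 5
beta_star = np.zeros(d); beta_star[:s] = 2.0
z = rng.choice([-1.0, 1.0], size=n)
Y = z[:, None] * beta_star + rng.normal(size=(n, d))
Y[:5] = 50.0                                      # corrupt 2.5% of samples
beta0 = beta_star + 0.2 * rng.normal(size=d)
beta_hat = trimmed_gradient_em(Y, beta0, s=s, eps=0.05)
print(np.linalg.norm(beta_hat - beta_star))
```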

Keywords: Robust statistics, High-dimensional statistics, Gaussian mixture model, Expectation maximization, Iterative hard thresholding

Review process:

Official paper URL: https://doi.org/10.1007/s10994-020-05926-z