Double fused Lasso penalized LAD for matrix regression

Authors:

Highlights:

Abstract

Increasingly complex data, in which a response is regressed on both vector and matrix predictors, arise in statistics and machine learning. Recently, Zhou and Li (2014) proposed matrix regression based on the least squares (LS) method, but they mainly considered regularized matrix regression with a nuclear norm penalty under the assumption that the noise has mean 0 and fixed covariance. In practice, the noise may be heavy-tailed or its distribution unknown. In such cases, it is well known that the least absolute deviation (LAD) method performs better than the LS method. Taking the structure of the predictors into account, we propose the double fused Lasso penalized LAD for matrix regression in this paper. The new penalty combines the fused Lasso and a matrix-type fused Lasso. We establish a strong duality theorem between the double fused Lasso penalized LAD problem and its dual. Building on this result, we design a highly scalable symmetric Gauss–Seidel based Alternating Direction Method of Multipliers (sGS-ADMM) algorithm to solve the dual problem. Moreover, we prove global convergence and a Q-linear rate of convergence. Finally, the effectiveness of our method is demonstrated by numerical experiments on simulated and real datasets.
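To make the objective concrete, the following is a minimal sketch of the kind of criterion the abstract describes: an LAD loss on a response with both a vector and a matrix predictor, plus a fused Lasso penalty on the vector coefficient and a matrix-type fused Lasso penalty on the matrix coefficient. The exact penalty form used in the paper is not given in the abstract, so the function names (`fused_lasso_penalty`, `matrix_fused_penalty`, `lad_objective`), the specific row/column difference structure of the matrix penalty, and the four tuning parameters are illustrative assumptions, not the authors' definitions.

```python
import numpy as np

def fused_lasso_penalty(gamma, lam1, lam2):
    # Classical 1D fused Lasso: sparsity on entries plus
    # sparsity on successive differences (assumed form).
    return lam1 * np.abs(gamma).sum() + lam2 * np.abs(np.diff(gamma)).sum()

def matrix_fused_penalty(B, lam3, lam4):
    # A matrix-type fused Lasso: entrywise sparsity plus
    # row-wise and column-wise difference penalties (assumed form).
    row_diffs = np.abs(np.diff(B, axis=0)).sum()
    col_diffs = np.abs(np.diff(B, axis=1)).sum()
    return lam3 * np.abs(B).sum() + lam4 * (row_diffs + col_diffs)

def lad_objective(y, Z, Xs, gamma, B, lams):
    """Double fused Lasso penalized LAD objective (illustrative sketch).

    y     : (n,) responses
    Z     : (n, p) vector predictors
    Xs    : list of n matrix predictors, each the same shape as B
    gamma : (p,) vector coefficient
    B     : matrix coefficient
    lams  : (lam1, lam2, lam3, lam4) tuning parameters
    """
    lam1, lam2, lam3, lam4 = lams
    # LAD loss: sum_i |y_i - z_i^T gamma - <X_i, B>|
    inner = np.array([np.sum(Xi * B) for Xi in Xs])
    resid = y - Z @ gamma - inner
    return (np.abs(resid).sum()
            + fused_lasso_penalty(gamma, lam1, lam2)
            + matrix_fused_penalty(B, lam3, lam4))
```

Minimizing this nonsmooth objective directly is hard at scale, which is why the paper works with the dual problem and an sGS-ADMM scheme; the sketch above is only the primal criterion being evaluated.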

Keywords: Matrix regression, Double fused Lasso, LAD, sGS-ADMM, Q-linear rate of convergence

Article history: Received 20 November 2018, Accepted 25 March 2019, Available online 9 April 2019, Version of Record 9 April 2019.

DOI: https://doi.org/10.1016/j.amc.2019.03.051