Joint sparse principal component regression with robust property

Abstract

Standard principal component regression (PCR) selects principal components without any consideration of the response variable, so it may fail to produce a sufficiently accurate model. Sparse principal component regression (SPCR) is a novel one-stage procedure that extracts principal components and constructs a linear regression model simultaneously. Because SPCR can be viewed as a combination of standard PCR and sparse principal component analysis (SPCA), it also inherits drawbacks from both: the elementwise absolute-value penalty and the least-squares loss leave SPCR without joint sparsity or robustness. To address these problems, this paper proposes joint sparse principal component regression (JSPCR), which selects important features jointly and estimates the coefficients robustly, and therefore achieves higher prediction accuracy. To further encourage sparsity in the loading matrix, we apply a sparse group penalty and extend JSPCR to joint bi-level sparse PCR (JBSPCR). We derive an alternating optimization algorithm and prove its convergence. Simulations on synthetic data and a real-data analysis strongly support that JSPCR is superior to SPCR and to other well-known dimensionality-reduction and sparse-modeling techniques.
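The one-stage idea the abstract describes — extracting a direction and fitting the regression within a single criterion, rather than running PCA first and regression second — can be sketched in a minimal form. The snippet below is an illustrative single-component alternating scheme with an L1 penalty and a least-squares loss (the simple setting the paper improves upon), not the paper's JSPCR criterion; the function name, the SPCA-style renormalization heuristic, and all parameter values are assumptions made for the sketch.

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding, the proximal operator of the L1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_pcr_one_component(X, y, lam=10.0, n_iter=200):
    """Illustrative one-component sparse PCR (NOT the paper's JSPCR).

    Alternates between:
      * g: regression coefficient on the score z = X v (closed form),
      * v: proximal-gradient step on ||y - g*X v||^2 + lam*||v||_1,
        followed by renormalization to unit length, a common SPCA-style
        heuristic that fixes the scale of the loading vector.
    """
    # Spectral norm squared of X, used in the Lipschitz step-size bound.
    s2 = np.linalg.norm(X, 2) ** 2
    # Supervised initialization: direction of X^T y.
    v = X.T @ y
    v /= np.linalg.norm(v)
    g = 0.0
    for _ in range(n_iter):
        z = X @ v
        g = float(z @ y) / max(float(z @ z), 1e-12)    # closed-form regression step
        eta = 1.0 / (2.0 * g * g * s2 + 1e-12)         # 1/L for the smooth part
        grad = -2.0 * g * (X.T @ (y - g * z))          # gradient of squared loss in v
        v = soft_threshold(v - eta * grad, eta * lam)  # sparsifying proximal step
        nrm = np.linalg.norm(v)
        if nrm > 0.0:
            v /= nrm
    return v, g
```

Because the response y enters the criterion that shapes v, irrelevant coordinates of the loading are driven to exact zeros while the predictive ones survive — the supervised behavior that distinguishes one-stage sparse PCR from running PCA and then regressing on the scores.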

Keywords: Dimension reduction, Joint property, Principal component regression, Robustness

Article history: Received 22 April 2020, Revised 22 June 2021, Accepted 30 August 2021, Available online 10 September 2021, Version of Record 14 September 2021.

DOI: https://doi.org/10.1016/j.eswa.2021.115845