A globally convergent approximate Newton method for non-convex sparse learning

Authors:

Highlights:

• We propose a globally convergent approximate Newton (GCAN) method for cardinality-constrained sparse learning of linear models. We prove theoretically that the method converges asymptotically. Additionally, we analyze its sparse-recovery performance under suitable conditions and show that our method guarantees sparsity recovery under a broader condition than existing approaches.

• Unlike other second-order algorithms, our method never needs to access the Hessian matrix or its inverse. It can therefore be applied to linear models with a complex Hessian (e.g., Softmax regression) or with no explicit Hessian at all (e.g., L2-SVMs).

• We apply our method to several linear models. The experimental results demonstrate that the proposed algorithm outperforms first-order algorithms and achieves a better trade-off between accuracy and efficiency than competing methods.
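The paper's GCAN updates are not reproduced in this listing, but the cardinality-constrained problem it targets is the one also handled by the iterative hard thresholding (IHT) baseline named in the keywords below. As background only, here is a minimal NumPy sketch of plain IHT for least squares; all function names and parameters are illustrative, not taken from the paper:

```python
import numpy as np

def hard_threshold(w, k):
    """Keep the k largest-magnitude entries of w; zero out the rest."""
    out = np.zeros_like(w)
    idx = np.argsort(np.abs(w))[-k:]
    out[idx] = w[idx]
    return out

def iht_least_squares(A, y, k, step=None, iters=200):
    """Iterative hard thresholding for min ||Aw - y||^2  s.t.  ||w||_0 <= k.

    Each iteration takes a gradient step and projects back onto the
    set of k-sparse vectors via hard thresholding.
    """
    n, d = A.shape
    if step is None:
        # 1/L, where L = ||A||_2^2 bounds the gradient's Lipschitz constant.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    w = np.zeros(d)
    for _ in range(iters):
        grad = A.T @ (A @ w - y)
        w = hard_threshold(w - step * grad, k)
    return w
```

A first-order scheme like this uses only gradients; the paper's contribution is a second-order (approximate Newton) method that likewise avoids forming the Hessian while converging faster, which this sketch does not attempt to reproduce.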

Keywords: Sparse learning, Newton-type method, Linear models, Quadratic approximation, Iterative hard thresholding

Article history: Received 9 July 2021; Revised 11 December 2021; Accepted 25 January 2022; Available online 3 February 2022; Version of Record 16 February 2022.

DOI: https://doi.org/10.1016/j.patcog.2022.108560