A Novel Quasi-Newton Method for Composite Convex Minimization

Authors:

Highlights:

• A novel parallelizable Jacobi-iteration-type method for non-smooth convex composite optimization is proposed.

• Both first- and second-order information are utilised, whereas typical state-of-the-art methods use only first-order information.

• The state-of-the-art second-order technique, the BFGS quasi-Newton method, as well as three first-order techniques, steepest gradient descent with respect to two different norms and Nesterov's accelerated gradient descent, are integrated into the proposed method (see the sketch after these highlights).

• A convergence rate with a lower bound of O(1/k²) as well as superlinear convergence is enjoyed.

• The proposed method converges significantly faster than state-of-the-art methods (e.g. APA-APG1 and APA-APG2, methods which enjoy O(1/k²) convergence).
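As context for the composite setting and the proximal mapping referenced above, the following is a minimal, hedged sketch of a FISTA-style accelerated proximal gradient iteration on a lasso-type objective min_x ½‖Ax − b‖² + λ‖x‖₁. It is not the paper's algorithm and omits the BFGS quasi-Newton component; the function names, step-size choice, and synthetic data are illustrative assumptions.

```python
# Minimal sketch (NOT the paper's method): FISTA-style accelerated proximal
# gradient for the composite problem min_x 0.5*||Ax - b||^2 + lam*||x||_1.
# Names (soft_threshold, fista) and the step-size choice are assumptions.
import numpy as np

def soft_threshold(v, t):
    """Proximal mapping of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista(A, b, lam, n_iter=500):
    """Accelerated proximal gradient (Nesterov/FISTA) on a lasso objective."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)                        # gradient of the smooth part at y
        x_new = soft_threshold(y - grad / L, lam / L)   # proximal (soft-threshold) step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # Nesterov extrapolation
        x, t = x_new, t_new
    return x

# Usage on synthetic data (illustrative only)
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = fista(A, b, lam=0.1)
```

The Nesterov extrapolation step is what gives accelerated proximal gradient methods their O(1/k²) rate, which is the first-order baseline against which the proposed quasi-Newton method is compared.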

Keywords: non-smooth, proximal mapping, quasi-Newton

Article history: Received 20 June 2020, Revised 21 April 2021, Accepted 28 August 2021, Available online 20 September 2021, Version of Record 10 October 2021.

Article link: https://doi.org/10.1016/j.patcog.2021.108281