An Accelerated Double Step Size model in unconstrained optimization

Authors:

Highlights:

Abstract

This work presents a double step size algorithm with an acceleration property for solving nonlinear unconstrained optimization problems. Using an inexact line search technique, together with an approximation of the Hessian by an adequate diagonal matrix, an efficient accelerated gradient descent method is developed. The proposed method is proven to be linearly convergent for uniformly convex functions and, under some specific conditions, linearly convergent for strictly convex quadratic functions. Numerical tests and comparisons show that the constructed scheme outperforms some known iterative methods for unconstrained optimization with respect to all three measured criteria: number of iterations, CPU time, and number of function evaluations.
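The abstract does not give the exact iteration of the proposed method, but the ingredients it names (an inexact line search plus a diagonal, here scalar, approximation gamma * I of the Hessian that supplies a second step size) can be illustrated with a generic sketch. The names `armijo_step` and `accelerated_gd`, the Armijo backtracking parameters, and the particular update rule for `gamma` below are illustrative assumptions, not the authors' scheme.

```python
import numpy as np

def armijo_step(f, x, g, d, t0=1.0, beta=0.5, sigma=1e-4, max_back=50):
    """Backtracking (Armijo) inexact line search along direction d."""
    t = t0
    fx = f(x)
    slope = g @ d  # directional derivative; negative for a descent direction
    for _ in range(max_back):
        if f(x + t * d) <= fx + sigma * t * slope:
            return t
        t *= beta
    return t

def accelerated_gd(f, grad, x0, tol=1e-6, max_iter=1000):
    """Gradient descent with two step sizes: the line-search step t and a
    scalar Hessian approximation gamma * I (an assumed, simplified form of
    a diagonal approximation). gamma is refreshed each iteration from a
    second-order Taylor-like model of the observed function decrease."""
    x = np.asarray(x0, dtype=float)
    gamma = 1.0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g / gamma                      # scaled steepest-descent direction
        t = armijo_step(f, x, g, d)         # inexact line search
        x_new = x + t * d
        # Fit gamma so that a quadratic model matches the actual decrease.
        s = t / gamma
        num = 2.0 * (f(x_new) - f(x) + s * (g @ g))
        den = s * s * (g @ g)
        gamma = num / den if num > 0 and den > 0 else 1.0  # keep gamma > 0
        x = x_new
    return x

# Usage: minimize a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = accelerated_gd(f, grad, np.zeros(2))
```

For a quadratic objective the `gamma` update reduces to the Rayleigh quotient `g @ A @ g / (g @ g)`, so the scalar stays between the extreme eigenvalues of `A`, which is the sense in which a single well-chosen scalar can stand in for the Hessian.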

Keywords: Line search, Gradient descent methods, Newton method, Convergence rate

Article history: Available online 18 November 2014.

Article URL: https://doi.org/10.1016/j.amc.2014.10.104