New quasi-Newton methods for unconstrained optimization problems

Authors:

Highlights:

Abstract:

Many methods for solving minimization problems are variants of Newton's method, which requires the specification of the Hessian matrix of second derivatives. Quasi-Newton methods are intended for situations where the Hessian is expensive or difficult to calculate; they use only first derivatives to build an approximate Hessian over a number of iterations, updating the approximation at each iteration by a matrix of low rank. In unconstrained minimization, the original quasi-Newton equation is B_{k+1} s_k = y_k, where y_k is the difference of the gradients at the last two iterates. In this paper, we first propose a new quasi-Newton equation B_{k+1} s_k = y_k^*, in which y_k^* is determined by the sum of y_k and A_k s_k, where A_k is some matrix. We then give two choices of A_k that carry second-order information from the Hessian of the objective function. The three corresponding BFGS-type algorithms are proved to possess the global convergence property, and superlinear convergence is proved for one of them. Extensive numerical experiments show that the proposed algorithms are very encouraging.
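The modified secant equation described above can be sketched as a one-step BFGS-type update. This is a minimal illustration, not the paper's algorithm: the paper's specific choices of A_k (which carry second-order information) are not reproduced here, so A is left as an arbitrary user-supplied matrix.

```python
import numpy as np

def bfgs_type_update(B, s, y, A=None):
    """One BFGS-type update of the Hessian approximation B.

    Standard BFGS enforces the secant equation B_new @ s = y.
    Following the modified equation B_{k+1} s_k = y_k^*, we replace
    y by y_star = y + A @ s for an optional matrix A (here a
    hypothetical placeholder, not one of the paper's two choices).
    """
    y_star = y if A is None else y + A @ s
    Bs = B @ s
    # Rank-two update: remove old curvature along s, then enforce
    # the (modified) secant equation B_new @ s = y_star exactly.
    B_new = (B
             - np.outer(Bs, Bs) / (s @ Bs)
             + np.outer(y_star, y_star) / (y_star @ s))
    return B_new

# Usage sketch: the update satisfies the modified secant equation.
B = np.eye(2)
s = np.array([1.0, 0.5])
y = np.array([2.0, 1.0])
A = 0.1 * np.eye(2)          # illustrative choice only
B1 = bfgs_type_update(B, s, y, A)
print(np.allclose(B1 @ s, y + A @ s))
```

As in standard BFGS, the update is well defined only when the curvature conditions s @ (B @ s) > 0 and y_star @ s > 0 hold; line-search safeguards in the full algorithms are omitted from this sketch.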

Keywords: Unconstrained optimization, Quasi-Newton method, Quasi-Newton equation, Global convergence, Superlinear convergence

Review history: Available online 21 October 2005.

Paper URL: https://doi.org/10.1016/j.amc.2005.08.027