Truncated partitioning group correction algorithms for large-scale sparse unconstrained optimization

Abstract:

This paper presents approaches to solving large-scale sparse unconstrained optimization problems based on a successive partitioning group correction algorithm. In large-scale optimization, solving the Newton-like equations exactly at each iteration can be expensive and may not be justified when the iterate is far from a solution. Instead, an inexact solution of the Newton-like equations is computed by a conjugate gradient method. In addition, the methods depend on a symmetric consistent partition of the columns of the Hessian matrix. A q-superlinear convergence result and an r-convergence rate estimate show that the methods have good local convergence properties. Global convergence is proved, and the numerical results show that the methods may be competitive with some currently used algorithms.
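
The abstract describes a truncated Newton-like iteration in which the inner linear system is solved only approximately by a conjugate gradient method. The sketch below is a minimal, generic illustration of that outer/inner structure, not the paper's algorithm: the successive partitioning group correction of the sparse Hessian and the consistent column partition are not reproduced, and the function name `truncated_newton`, the forcing term, and the backtracking line search are assumptions made for the example.

```python
import numpy as np

def truncated_newton(f, grad, hess, x0, tol=1e-6, max_iter=100):
    """Outer Newton-like iteration with a truncated CG inner solve (illustrative)."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        H = hess(x)
        # Inner CG: solve H p = -g only approximately ("truncated"),
        # stopping once the residual meets a forcing condition.
        eta = min(0.5, np.sqrt(gnorm))      # forcing term (assumed choice)
        p = np.zeros_like(x)
        r = -g.copy()                       # residual of H p = -g at p = 0
        d = r.copy()
        for _ in range(x.size):
            if np.linalg.norm(r) <= eta * gnorm:
                break
            Hd = H @ d
            curv = d @ Hd
            if curv <= 0.0:                 # negative curvature: stop CG early
                if not p.any():
                    p = -g                  # fall back to steepest descent
                break
            alpha = (r @ r) / curv
            p = p + alpha * d
            r_new = r - alpha * Hd
            beta = (r_new @ r_new) / (r @ r)
            d = r_new + beta * d
            r = r_new
        # Backtracking (Armijo) line search along the inexact Newton step.
        t, fx, slope = 1.0, f(x), g @ p
        while f(x + t * p) > fx + 1e-4 * t * slope and t > 1e-12:
            t *= 0.5
        x = x + t * p
    return x

# Toy usage: a convex quadratic whose Hessian is sparse (tridiagonal).
if __name__ == "__main__":
    n = 50
    A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.ones(n)
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    hess = lambda x: A
    x_star = truncated_newton(f, grad, hess, np.zeros(n))
    print("final gradient norm:", np.linalg.norm(grad(x_star)))
```

Truncating the CG loop with a forcing condition of the form ||Hp + g|| <= eta ||g|| is what avoids the full cost of solving the Newton-like equations when the iterate is far from a solution, which is the motivation stated in the abstract.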

Keywords: Unconstrained optimization, Truncated Newton-like methods, Inexact, Sparsity, Partition, Conjugate gradient algorithms

Available online 19 January 2007.

DOI: https://doi.org/10.1016/j.amc.2007.01.023