An incremental decomposition method for unconstrained optimization

Abstract:

In this work we consider the problem of minimizing a sum of continuously differentiable functions. The vector of variables is partitioned into two blocks, and we assume that the objective function is convex with respect to one of the blocks. Problems with this structure arise, for instance, in machine learning. To exploit the structure of the objective function, and to account for the fact that the number of terms in the sum may be huge, we propose a decomposition algorithm combined with an incremental gradient strategy. Global convergence of the proposed algorithm is proved. Computational experiments on large-scale real problems show the effectiveness of the proposed approach compared with existing algorithms.
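To make the setting concrete, the following is a minimal sketch (not the paper's exact algorithm) of a two-block incremental gradient scheme: each epoch sweeps the component functions f_i once, taking a gradient step in both blocks per component. The toy problem, step size, and function names (`incremental_block_gradient`, `grad_fi`) are illustrative assumptions.

```python
import numpy as np

def incremental_block_gradient(grad_fi, n_terms, x0, y0, step=0.01, epochs=200):
    """Illustrative two-block incremental gradient sweep (not the paper's
    exact method): per epoch, visit each component f_i once and take a
    gradient step in both blocks of variables."""
    x, y = x0.astype(float).copy(), y0.astype(float).copy()
    for _ in range(epochs):
        for i in range(n_terms):
            gx, gy = grad_fi(i, x, y)  # gradients of f_i w.r.t. blocks x and y
            x = x - step * gx
            y = y - step * gy
    return x, y

# Toy instance: f_i(x, y) = 0.5 * (a_i @ x + b_i @ y - c_i)**2, a sum of
# smooth terms that is convex in the block y (indeed jointly convex here).
rng = np.random.default_rng(0)
A, B = rng.normal(size=(50, 3)), rng.normal(size=(50, 2))
c = A @ np.array([1.0, -2.0, 0.5]) + B @ np.array([0.3, 1.5])

def grad_fi(i, x, y):
    r = A[i] @ x + B[i] @ y - c[i]  # residual of the i-th term
    return r * A[i], r * B[i]

x, y = incremental_block_gradient(grad_fi, 50, np.zeros(3), np.zeros(2))
print(0.5 * np.sum((A @ x + B @ y - c) ** 2))  # total objective, near zero
```

Since the toy system is consistent, the component gradients all vanish at the common minimizer, so even a constant step size drives the objective essentially to zero; on real sum-structured problems a diminishing step rule is the usual choice.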

Keywords: Large-scale unconstrained optimization, Decomposition, Gradient incremental methods

Publication history: Available online 22 March 2014.

Article URL: https://doi.org/10.1016/j.amc.2014.02.088