CGRS — An advanced hybrid method for global optimization of continuous functions closely coupling extended random search and conjugate gradient method

Authors:

Highlights:

Abstract

A new hybrid method for global optimization of continuous functions is proposed. It combines an extended random search method with a descent method. Random search is used as the global search strategy. A newly developed distribution-based region control makes use of already detected local minima to refine this search strategy; the approach resembles classical step size control in deterministic optimization. The descent method is embedded as a local search strategy for the detection of local minima. A special realization of this approach, called CGRS, is presented in this paper. In CGRS the conjugate gradient method is used as the descent method. A proof of global convergence in probability for CGRS is given and extended to other descent methods used in the hybrid optimization approach. To demonstrate the numerical properties of the approach, test sets of multidimensional non-convex optimization problems are solved, and the results are compared to well-established hybrid methods for global optimization. The new algorithm shows a high success rate with good and adjustable solution precision. Parameter tuning is not necessary, though possible. The new method also proves efficient in terms of computational cost.
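To make the hybrid structure described in the abstract concrete, the following is a minimal illustrative sketch of a "random search + conjugate gradient" optimizer. It is not the authors' CGRS algorithm: the distribution-based region control is omitted, and the objective, bounds, and helper names are assumptions chosen only for the example. It merely shows the general pattern of drawing random start points globally and polishing each with a conjugate gradient descent locally.

```python
# Illustrative sketch only: uniform random global sampling plus local
# conjugate gradient descent (SciPy's "CG" method). The paper's
# distribution-based region control is NOT reproduced here.
import numpy as np
from scipy.optimize import minimize


def hybrid_rs_cg(f, bounds, n_starts=50, seed=0):
    """Run conjugate gradient descent from random start points, keep the best minimum.

    f        : objective function mapping an ndarray to a float
    bounds   : list of (low, high) pairs, one per dimension
    n_starts : number of random restarts (global search budget)
    """
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])

    best_x, best_f = None, np.inf
    for _ in range(n_starts):
        # Global step: uniform random sample inside the search box.
        x0 = rng.uniform(lo, hi)
        # Local step: conjugate gradient descent to a nearby local minimum.
        res = minimize(f, x0, method="CG")
        if res.fun < best_f:
            best_x, best_f = res.x, res.fun
    return best_x, best_f


if __name__ == "__main__":
    # Example: 2-D Rastrigin function, a standard non-convex test problem.
    def rastrigin(x):
        return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

    x_best, f_best = hybrid_rs_cg(rastrigin, [(-5.12, 5.12)] * 2, n_starts=100)
    print(x_best, f_best)
```

In the actual method, the sampling distribution would additionally be adapted around already detected local minima (the distribution-based region control), rather than remaining uniform as in this sketch.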

Keywords: Global optimization, Random search, Conjugate gradient method, Hybrid approach, Convergence in probability, Distribution-based region control

Article history: Received 7 December 2016, Revised 5 October 2017, Accepted 9 October 2017, Available online 31 October 2017, Version of Record 21 November 2017.

DOI: https://doi.org/10.1016/j.cam.2017.10.018