Comparison of Stochastic Global Optimization Methods to Estimate Neural Network Weights

Authors: Lonnie Hamm, B. Wade Brorsen, Martin T. Hagan

Abstract

Training a neural network is a difficult optimization problem because of numerous local minima. Many global search algorithms have been used to train neural networks. However, local search algorithms are more efficient with computational resources, so numerous random restarts with a local algorithm may be more effective than a global algorithm. This study uses Monte Carlo simulations to determine the efficiency of a local search algorithm relative to nine stochastic global algorithms when training a neural network on function approximation problems. The computational requirements of the global algorithms are several times those of the local algorithm, and there is little gain in using the global algorithms to train neural networks. Because the global algorithms only marginally outperform the local algorithm in obtaining a lower local minimum while requiring more computational resources, the results indicate that, for the specific algorithms and function approximation problems studied, there is little evidence that a global algorithm should be preferred over a more traditional local optimization routine for training neural networks. Further, neural networks should not be estimated from a single set of starting values, whether a global or local optimization method is used.
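The multi-start strategy the abstract recommends is straightforward to sketch. Below is a minimal illustration (not the authors' code): a one-hidden-layer network is trained by plain gradient descent from several random starting points, and the best local minimum is kept. The target function, network size, learning rate, and restart count are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.sin(3.0 * X)  # hypothetical target for function approximation

def loss(W1, b1, W2, b2):
    h = np.tanh(X @ W1 + b1)          # one hidden layer, tanh activation
    return np.mean((h @ W2 + b2 - y) ** 2)

def train_once(hidden=5, lr=0.05, steps=2000):
    """One local search: full-batch gradient descent from a random start."""
    W1 = rng.normal(0, 0.5, (1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    n = len(X)
    for _ in range(steps):
        # forward pass
        h = np.tanh(X @ W1 + b1)
        err = h @ W2 + b2 - y
        # backward pass: hand-coded gradients of the mean squared error
        gW2 = h.T @ err * (2 / n); gb2 = 2 * err.mean(0)
        dh = err @ W2.T * (1 - h ** 2)
        gW1 = X.T @ dh * (2 / n); gb1 = 2 * dh.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return loss(W1, b1, W2, b2)

# Numerous random restarts: keep the best local minimum found.
losses = [train_once() for _ in range(10)]
print(f"best of 10 restarts: {min(losses):.5f}, worst: {max(losses):.5f}")
```

The spread between the best and worst restart illustrates why a single set of starting values is risky: individual runs can stall in poor local minima even on a simple target.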

Keywords: Evolutionary algorithms, Function approximation, Neural networks, Simulated annealing, Stochastic global optimization

Paper link: https://doi.org/10.1007/s11063-007-9048-7