Improved constrained learning algorithms by incorporating additional functional constraints into neural networks

Authors:

Highlights:

Abstract

In this paper, two improved constrained learning algorithms that can guarantee better generalization performance are proposed. Both are essentially on-line learning algorithms. The additional cost term of the first improved algorithm is selected based on the first-order derivatives of the neuron activations at the hidden layers, while that of the second is selected based on the second-order derivatives of the neuron activations at the hidden layers and the output layer. During training, the cost terms selected from these additional cost functions penalize the input-to-output mapping sensitivity or the high-frequency components contained in the training data. Finally, theoretical justifications and simulation results are given to verify the efficiency and effectiveness of the two proposed learning algorithms.
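The abstract's first additional cost term, a penalty on the first-order derivatives of the hidden-layer activations, can be illustrated with a minimal sketch. The network shape, the use of sigmoid units, the function name, and the weighting parameter `lam` are all assumptions for illustration; the paper's exact cost functions and update rules are given in the full text.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def loss_with_sensitivity_penalty(W1, b1, W2, b2, X, y, lam=0.1):
    """Squared-error loss plus a penalty on the first-order derivatives of
    the hidden-layer sigmoid activations (a hypothetical stand-in for the
    first additional cost term described in the abstract)."""
    z1 = X @ W1 + b1          # hidden pre-activations
    h = sigmoid(z1)           # hidden activations
    yhat = h @ W2 + b2        # linear output layer
    mse = 0.5 * np.mean((yhat - y) ** 2)
    # sigmoid'(z) = s(z) * (1 - s(z)); large values indicate high
    # input-to-output mapping sensitivity at that hidden neuron
    penalty = np.mean(h * (1.0 - h))
    return mse + lam * penalty
```

Because the penalty shrinks the activation derivatives, gradient descent on this combined cost drives hidden units toward their saturated (low-sensitivity) regions, which is the mechanism the abstract credits for the improved generalization.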

Keywords: Feedforward neural network, On-line constrained learning algorithm, Generalization, Mapping sensitivity, High-frequency components, Time series prediction

Article history: Available online 15 June 2005.

DOI: https://doi.org/10.1016/j.amc.2005.04.073