A regularized root–quartic mixture of experts for complex classification problems

Authors:

Highlights:

Abstract

Mixture of experts is a neural-network-based ensemble learning approach consisting of several experts and a gating network. In this paper, we introduce the regularized root–quartic mixture of experts (R-RTQRT-ME), which incorporates a regularization term into the error function to control the complexity of the model and to increase robustness against over-fitting and noise. Averaged over 20 classification benchmark datasets, R-RTQRT-ME performs 1.75%, 2.50%, and 2.29% better than multi-objective regularized negative correlation learning, multi-objective negative correlation learning, and multi-objective neural networks, respectively. Also, R-RTQRT-ME improves on root-quartic mixture of experts, mixture of negatively correlated experts, mixture of experts, and negative correlation learning by 1.16%, 2.31%, 3.40%, and 3.39% on average, respectively. Furthermore, the effect of the regularization penalty term in R-RTQRT-ME on noisy data is analyzed, which demonstrates the robustness of R-RTQRT-ME in these situations.
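As a rough illustration of the idea described above, the sketch below builds a tiny mixture of experts (linear experts plus a softmax gating network) and evaluates a training objective that adds a complexity penalty to the base error. This is a minimal sketch, not the paper's method: the squared-error base loss, the L2 penalty, the lambda value, and all network sizes are assumptions standing in for the paper's actual root–quartic error function and regularizer.

```python
import numpy as np

# Illustrative sketch only: a tiny mixture-of-experts forward pass whose
# objective adds an L2 penalty on the weights. The paper's actual
# root-quartic error and regularization term are not reproduced here; the
# squared-error base loss, lambda, and layer sizes are assumptions.

rng = np.random.default_rng(0)

n_features, n_experts, n_classes = 8, 3, 2
X = rng.normal(size=(32, n_features))                        # toy input batch
T = np.eye(n_classes)[rng.integers(0, n_classes, size=32)]   # one-hot targets

# One linear layer per expert and a linear gating network (softmax over experts).
W_experts = rng.normal(scale=0.1, size=(n_experts, n_features, n_classes))
W_gate = rng.normal(scale=0.1, size=(n_features, n_experts))

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def forward(X):
    gate = softmax(X @ W_gate)                            # (batch, experts)
    expert_out = np.einsum('bf,efc->bec', X, W_experts)   # (batch, experts, classes)
    expert_out = softmax(expert_out, axis=-1)
    return np.einsum('be,bec->bc', gate, expert_out)      # gate-weighted combination

def regularized_error(X, T, lam=1e-3):
    # Base error (a stand-in for the root-quartic error) plus an
    # L2 complexity penalty over all expert and gating weights.
    Y = forward(X)
    base = np.mean((Y - T) ** 2)
    penalty = lam * (np.sum(W_experts ** 2) + np.sum(W_gate ** 2))
    return base + penalty

print(f"regularized error on the toy batch: {regularized_error(X, T):.4f}")
```

The point of the penalty term is the same as in the abstract: shrinking the weights discourages overly complex experts, which is what gives the ensemble its robustness to over-fitting and noise.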

Keywords: Mixture of experts, Negative correlation learning, Ensemble learning, Diversity, Generalization ability, Regularization

Article history: Received 29 September 2015, Revised 9 July 2016, Accepted 11 July 2016, Available online 12 July 2016, Version of Record 29 September 2016.

Paper URL: https://doi.org/10.1016/j.knosys.2016.07.018