Technical efficiency-based selection of learning cases to improve forecasting accuracy of neural networks under monotonicity assumption

Authors:

Highlights:

Abstract

In this paper, we show that when an artificial neural network (ANN) model is used to learn a monotonic forecasting function, it may be useful to screen the training data so that the retained examples approximately satisfy the monotonicity property. We show how a technical efficiency-based ranking, computed with a data envelopment analysis (DEA) model and a predetermined threshold efficiency level, can identify such a subset of examples. Using a health care forecasting problem, the monotonicity assumption, and a predetermined threshold efficiency level, we apply DEA to split the training data into two mutually exclusive subsets, “efficient” and “inefficient”. We then compare the performance of ANNs trained on each subset. Our results indicate that an ANN trained on the “efficient” training data subset achieves higher predictive performance than an ANN trained on the “inefficient” training data subset.
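The screening step described above can be sketched in a few lines of Python. The paper does not specify which DEA formulation or threshold it uses, so the sketch below assumes an input-oriented CCR (constant returns to scale) model solved as a linear program, and an illustrative threshold of 0.9; the data are made up for demonstration.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y):
    """Input-oriented CCR efficiency score for each decision-making unit.

    X: (n, m) array of inputs; Y: (n, s) array of outputs.
    For each unit k, solves: min theta subject to
      sum_j lambda_j * x_j <= theta * x_k   (inputs)
      sum_j lambda_j * y_j >= y_k           (outputs)
      lambda_j >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    scores = np.empty(n)
    for k in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.zeros(n + 1)
        c[0] = 1.0  # minimize theta
        # Input constraints: sum_j lambda_j x_ij - theta x_ik <= 0
        A_in = np.hstack([-X[k].reshape(m, 1), X.T])
        b_in = np.zeros(m)
        # Output constraints: -sum_j lambda_j y_rj <= -y_rk
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[k]
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([b_in, b_out]),
                      bounds=[(0, None)] * (n + 1),
                      method="highs")
        scores[k] = res.x[0]
    return scores

# Toy data: one input, one output per training example (hypothetical).
X = np.array([[2.0], [4.0], [8.0]])
Y = np.array([[2.0], [2.0], [4.0]])
eff = dea_ccr_efficiency(X, Y)

# Split the training data at an illustrative threshold of 0.9.
threshold = 0.9
efficient = eff >= threshold  # "efficient" subset mask; complement is "inefficient"
```

In this toy example, only the first unit attains the best output-to-input ratio and scores 1.0, so it alone lands in the “efficient” subset; the two ANNs would then be trained on the rows selected by `efficient` and its complement, respectively.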

Keywords: Data envelopment analysis, Connectionist models/artificial neural networks, Human resource management

Article history: Accepted 3 July 2002, Available online 28 August 2002.

Article URL: https://doi.org/10.1016/S0167-9236(02)00138-0