Randomizing Outputs to Increase Prediction Accuracy

Author: Leo Breiman

Abstract

Bagging and boosting reduce error by changing both the inputs and outputs to form perturbed training sets, growing a predictor on each perturbed set, and combining the predictors. An interesting question is whether comparable performance can be achieved by perturbing the outputs alone. Two methods of randomizing outputs are examined: output smearing and output flipping. Both are shown to do consistently better than bagging.
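For context, here is a minimal sketch of the two output-randomization schemes, assuming NumPy arrays and scikit-learn decision trees as the base predictor (the paper uses CART). The names smeared_ensemble and flipped_ensemble and the hyperparameters noise_scale and flip_rate are illustrative, not from the paper; in particular, Breiman chooses flip probabilities so that class proportions are roughly preserved, whereas this sketch flips labels uniformly at random.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

rng = np.random.default_rng(0)

def smeared_ensemble(X, y, n_trees=50, noise_scale=1.0):
    # Output smearing (regression): every tree sees the same inputs X,
    # but each gets targets perturbed by Gaussian noise scaled to std(y).
    sd = y.std()
    trees = []
    for _ in range(n_trees):
        y_noisy = y + noise_scale * sd * rng.standard_normal(len(y))
        trees.append(DecisionTreeRegressor().fit(X, y_noisy))
    return trees

def flipped_ensemble(X, y, n_trees=50, flip_rate=0.1):
    # Output flipping (classification): every tree sees labels in which
    # a fraction flip_rate of examples are reassigned to a random other class.
    classes = np.unique(y)
    trees = []
    for _ in range(n_trees):
        y_flip = y.copy()
        for i in np.flatnonzero(rng.random(len(y)) < flip_rate):
            y_flip[i] = rng.choice(classes[classes != y[i]])
        trees.append(DecisionTreeClassifier().fit(X, y_flip))
    return trees

# Combine as in bagging: average the trees' predictions for smearing,
# take a majority vote over the trees for flipping.
```

Note that, unlike bagging, every predictor trains on the full input set; only the outputs differ across the ensemble.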

Keywords: ensemble, randomization, output variability

Paper URL: https://doi.org/10.1023/A:1007682208299