Double-bagging: combining classifiers by bootstrap aggregation

Authors:

Highlights:

Abstract

The combination of classifiers leads to a substantial reduction of misclassification error in a wide range of applications and benchmark problems. We suggest using an out-of-bag sample for combining different classifiers. In our setup, a linear discriminant analysis is performed on the observations in the out-of-bag sample, and the corresponding discriminant variables computed for the observations in the bootstrap sample are used as additional predictors for a classification tree. Because two classifiers are combined, method and variable selection bias does not affect the corresponding estimate of misclassification error, and the need for an additional test sample disappears. Moreover, the procedure performs comparably to the best of the classifiers used in a number of artificial examples and applications.
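The procedure sketched in the abstract can be illustrated in code. The following Python snippet is a minimal, hypothetical sketch (not the authors' implementation), assuming integer-coded class labels and using scikit-learn's LinearDiscriminantAnalysis and DecisionTreeClassifier as the two base learners; the function names and the number of bootstrap replications are illustrative.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier


def double_bagging_fit(X, y, n_bootstrap=50, random_state=0):
    """Fit a double-bagging ensemble of (out-of-bag LDA, bootstrap tree) pairs."""
    rng = np.random.default_rng(random_state)
    n = len(y)
    ensemble = []
    for _ in range(n_bootstrap):
        # Draw a bootstrap sample; the observations not drawn form the out-of-bag sample.
        boot_idx = rng.integers(0, n, size=n)
        oob_mask = np.ones(n, dtype=bool)
        oob_mask[boot_idx] = False
        # Skip replications whose out-of-bag sample does not contain every class.
        if len(np.unique(y[oob_mask])) < len(np.unique(y)):
            continue
        # Linear discriminant analysis estimated from the out-of-bag observations only.
        lda = LinearDiscriminantAnalysis().fit(X[oob_mask], y[oob_mask])
        # Discriminant variables of the bootstrap observations act as additional predictors.
        X_aug = np.hstack([X[boot_idx], lda.transform(X[boot_idx])])
        tree = DecisionTreeClassifier(random_state=random_state).fit(X_aug, y[boot_idx])
        ensemble.append((lda, tree))
    return ensemble


def double_bagging_predict(ensemble, X):
    """Aggregate the trees by majority vote (assumes non-negative integer class labels)."""
    votes = np.stack([tree.predict(np.hstack([X, lda.transform(X)]))
                      for lda, tree in ensemble])
    return np.array([np.bincount(col.astype(int)).argmax() for col in votes.T])
```

A call such as `double_bagging_predict(double_bagging_fit(X_train, y_train), X_test)` would then return the majority-vote class predictions over all bootstrap replications.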

Keywords: Bagging, Classification, Discriminant analysis, Method selection bias, Error rate estimation

Article history: Received 9 January 2002, Accepted 20 June 2002, Available online 19 December 2002.

DOI: https://doi.org/10.1016/S0031-3203(02)00169-3