Some aspects of error bounds in feature selection

Abstract:

In this paper we discuss various bounds on the Bayesian probability of error that are used for feature selection and are based on distance measures and information measures. We show that they fall into two basic types: one type can be related to the f-divergence, the other to information measures. This also clarifies some properties of these measures for the two-class problem and for the multiclass problem. We give some general bounds on the Bayesian probability of error and discuss various aspects of the different approaches.
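To make the idea of a distance-based error bound concrete, the sketch below illustrates one classical bound of the kind the abstract refers to: the Bhattacharyya bound for the two-class problem, which states that the Bayes error P_e satisfies P_e ≤ sqrt(P1·P2)·ρ, where ρ is the Bhattacharyya coefficient of the two class-conditional distributions. The discrete distributions used here are hypothetical, chosen only for illustration; the paper itself treats a broader family of such bounds.

```python
import math

# Hypothetical class-conditional distributions over a 3-valued feature
p1 = [0.5, 0.3, 0.2]   # p(x | class 1)
p2 = [0.1, 0.3, 0.6]   # p(x | class 2)
P1, P2 = 0.5, 0.5      # prior probabilities of the two classes

# Exact Bayes error for the two-class problem:
#   P_e = sum over x of min(P1 * p1(x), P2 * p2(x))
bayes_error = sum(min(P1 * a, P2 * b) for a, b in zip(p1, p2))

# Bhattacharyya coefficient (a distance-type measure between the classes):
#   rho = sum over x of sqrt(p1(x) * p2(x))
rho = sum(math.sqrt(a * b) for a, b in zip(p1, p2))

# Classical upper bound on the Bayes error: P_e <= sqrt(P1 * P2) * rho
bound = math.sqrt(P1 * P2) * rho

print(f"Bayes error = {bayes_error:.4f}, Bhattacharyya bound = {bound:.4f}")
assert bayes_error <= bound
```

A feature subset with a smaller ρ (the classes are "farther apart") yields a smaller guaranteed ceiling on the Bayes error, which is why such measures are usable as feature-selection criteria even when the error itself is hard to compute.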

Keywords: Information measures, Distance measures, Feature evaluation, f-divergence, Error bounds

Article history: Received 3 May 1978, Revised 6 February 1979, Available online 19 May 2003.

DOI: https://doi.org/10.1016/0031-3203(79)90047-5