A distribution-free geometric upper bound for the probability of error of a minimum distance classifier

Abstract:

An upper bound on the probability of error per class in a multivariate pattern classification problem is derived. The bound, P(E | class ω_i) ≤ N·R_i², is obtained under minimal assumptions: specifically, that the mean vectors exist and are distinct, and that the covariance matrices exist and are non-singular. No other assumptions are made about the nature of the class distributions. In this bound, N is the number of features in the feature (vector) space and R_i is a measure of the "radial neighbourhood" of a class. An expression for R_i is developed. A comparison with the multivariate Gaussian hypothesis is presented.
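As an illustrative reconstruction only, the sketch below shows how a Chebyshev-type argument (cf. the keyword "Chebyshev's inequality") can produce a bound of the form N·R_i² for a minimum distance classifier. The definition R_i := 2σ_{i,max}/d_i used here is an assumed stand-in, not necessarily the expression developed in the paper.

```latex
% Hedged sketch: a Chebyshev-style route to a bound of the form N R_i^2.
% R_i as defined here is an assumption for illustration, not the paper's expression.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $d_i = \min_{j \neq i} \lVert \mu_i - \mu_j \rVert$ (distinct means, so $d_i > 0$).
By the triangle inequality, a sample $x$ drawn from class $\omega_i$ is classified
correctly by the minimum distance rule whenever $\lVert x - \mu_i \rVert < d_i/2$, hence
\begin{align}
P(E \mid \omega_i)
  &\le P\!\left(\lVert x - \mu_i \rVert^{2} \ge (d_i/2)^{2} \,\middle|\, \omega_i\right) \\
  &\le \frac{\operatorname{tr}\Sigma_i}{(d_i/2)^{2}}
   \;\le\; N\left(\frac{2\,\sigma_{i,\max}}{d_i}\right)^{2} \;=\; N R_i^{2},
\end{align}
where $\Sigma_i$ is the class covariance, $\sigma_{i,\max}^{2}$ its largest
per-feature variance, and $R_i := 2\sigma_{i,\max}/d_i$ plays the role of a
``radial neighbourhood'' measure under these assumptions.
\end{document}
```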

Keywords: Distance classifiers, Error bounds, Multivariate distributions, Feature evaluation, Non-parametric statistics, Chebyshev's inequality

Article history: Received 29 August 1977; Available online 19 May 2003.

DOI: https://doi.org/10.1016/0031-3203(78)90037-7