Least desirable feature elimination in a general pattern recognition problem

Authors:

Highlights:

Abstract

A new technique for dimensionality reduction in a general pattern recognition problem, with pattern classes represented by multivariate normal distributions, is presented. The method consists of identifying and eliminating the least desirable feature from the original feature space, and it is shown to be considerably simpler than existing methods. A very simple expression is derived for the vector representing the least desirable feature in the most general case. If the original feature space is N-dimensional, then recognizing and eliminating such a feature is equivalent to selecting and retaining the N−1 best features. J-divergence is used as the measure of discrimination between the classes. A flow chart of the method for discarding more than one dimension is presented.
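The abstract does not reproduce the paper's closed-form expression for the least desirable feature, so the following is only a brute-force sketch of the underlying idea: compute the J-divergence between two multivariate normal classes and greedily drop the feature whose removal sacrifices the least divergence. The function and variable names (`j_divergence`, `drop_least_desirable`, `n_drop`) are illustrative assumptions, not the paper's notation, and this exhaustive subspace evaluation is not the simplified derivation the paper proposes.

```python
import numpy as np

def j_divergence(mu1, cov1, mu2, cov2):
    """Symmetric (Kullback) J-divergence between two multivariate normal classes."""
    inv1 = np.linalg.inv(cov1)
    inv2 = np.linalg.inv(cov2)
    diff = mu1 - mu2
    dim = len(mu1)
    # Covariance (trace) term plus mean-separation (Mahalanobis-like) term.
    trace_term = 0.5 * np.trace(inv1 @ cov2 + inv2 @ cov1) - dim
    mean_term = 0.5 * diff @ (inv1 + inv2) @ diff
    return trace_term + mean_term

def drop_least_desirable(mu1, cov1, mu2, cov2, n_drop=1):
    """Greedily discard the features whose elimination reduces J-divergence the least.

    Keeping N-1 features this way is equivalent to removing the single least
    desirable feature from an N-dimensional space, as described in the abstract.
    """
    keep = list(range(len(mu1)))
    for _ in range(n_drop):
        best_subset, best_j = None, -np.inf
        for feature in keep:
            idx = [i for i in keep if i != feature]
            j = j_divergence(mu1[idx], cov1[np.ix_(idx, idx)],
                             mu2[idx], cov2[np.ix_(idx, idx)])
            if j > best_j:
                best_subset, best_j = idx, j
        keep = best_subset
    return keep

# Hypothetical usage with two 4-dimensional normal classes.
rng = np.random.default_rng(0)
mu_a, mu_b = rng.normal(size=4), rng.normal(size=4)
a, b = rng.normal(size=(4, 4)), rng.normal(size=(4, 4))
cov_a, cov_b = a @ a.T + np.eye(4), b @ b.T + np.eye(4)
print(drop_least_desirable(mu_a, cov_a, mu_b, cov_b, n_drop=1))
```

The greedy loop mirrors the flow chart mentioned in the abstract for discarding more than one dimension: each pass removes one feature, then the search repeats on the reduced space.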

Keywords: Feature selection, Pattern classification, Data compression, Dimensionality reduction, Discriminant function

Article history: Received 1 May 1986, Revised 12 September 1986, Available online 19 May 2003.

DOI: https://doi.org/10.1016/0031-3203(87)90010-0