Approximate information discriminant analysis: A computationally simple heteroscedastic feature extraction technique
Abstract:
In this article we develop a novel linear dimensionality reduction technique for classification. The technique utilizes the first two statistical moments of the data and retains the computational simplicity characteristic of second-order techniques such as linear discriminant analysis. Formally, the technique maximizes a criterion that belongs to the class of probability dependence measures and is naturally defined for multiple classes. The criterion is based on an approximation of an information-theoretic measure and is capable of handling heteroscedastic data. The performance of our method, along with that of similar feature extraction approaches, is demonstrated through experimental results on real-world datasets. Our method compares favorably to similar second-order linear dimensionality reduction techniques.
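The paper's approximate information-theoretic criterion is not reproduced in this abstract. As a point of reference for the second-order baseline the abstract names, the following is a minimal sketch of classical linear discriminant analysis (LDA), which likewise uses only the first two statistical moments (class means and scatter matrices); all function names and data here are illustrative assumptions, not the authors' method.

```python
import numpy as np
from scipy.linalg import eigh

def lda_projection(X, y, n_components):
    """Linear projection maximizing between-class vs. within-class scatter.

    Uses only first moments (class means) and second moments (scatter
    matrices), as is characteristic of second-order techniques.
    """
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # Generalized eigenproblem Sb w = lambda Sw w; the top eigenvectors
    # span the discriminant subspace. A small ridge keeps Sw invertible.
    vals, vecs = eigh(Sb, Sw + 1e-9 * np.eye(d))
    order = np.argsort(vals)[::-1]
    return vecs[:, order[:n_components]]

# Toy two-class, three-dimensional example.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 3)),
               rng.normal(3.0, 1.0, (50, 3))])
y = np.array([0] * 50 + [1] * 50)
W = lda_projection(X, y, n_components=1)
Z = X @ W  # reduced one-dimensional features
```

Note that classical LDA is homoscedastic: it pools the within-class scatter into a single matrix, which is precisely the limitation the article's heteroscedastic criterion is designed to address.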
Keywords: Feature extraction, Information theory, Mutual information, Entropy, Classification, Linear discriminant analysis, Bayes error
Article history: Received 11 June 2007, Revised 26 September 2007, Accepted 4 October 2007, Available online 10 October 2007.
Article URL: https://doi.org/10.1016/j.patcog.2007.10.001