Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers

Abstract

Mika et al. (in: Neural Networks for Signal Processing IX, IEEE Press, New York, 1999, pp. 41–48) apply the "kernel trick" to obtain a non-linear variant of Fisher's linear discriminant analysis method, demonstrating state-of-the-art performance on a range of benchmark data sets. We show that leave-one-out cross-validation of kernel Fisher discriminant classifiers can be implemented with a computational complexity of only O(ℓ³) operations, rather than the O(ℓ⁴) of a naïve implementation, where ℓ is the number of training patterns. Leave-one-out cross-validation then becomes an attractive means of model selection in large-scale applications of kernel Fisher discriminant analysis, being significantly faster than the k-fold cross-validation procedures in common use.
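The flavour of the speed-up can be illustrated with a minimal sketch (not code from the paper): if the kernel Fisher discriminant is treated as a regularised kernel least-squares fit on ±1 targets, all ℓ leave-one-out residuals follow in closed form from a single O(ℓ³) factorisation, instead of refitting ℓ times at O(ℓ³) each, which is the O(ℓ⁴) naïve route. The RBF kernel, the regularisation parameter lam, and the function names below are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian RBF kernel matrix (an illustrative kernel choice)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def loo_error_kernel_ls(K, y, lam=1e-2):
    """
    Closed-form leave-one-out error rate for a regularised kernel
    least-squares classifier on +/-1 targets, used here as a stand-in
    for the kernel Fisher discriminant (the equivalence is an assumption
    of this sketch, not a quotation of the paper's derivation).

    One O(l^3) matrix inverse yields all l leave-one-out residuals,
    versus O(l^4) for refitting the classifier l times.
    """
    l = K.shape[0]
    C_inv = np.linalg.inv(K + lam * np.eye(l))   # single O(l^3) step
    alpha = C_inv @ y                            # dual weights of the full fit
    residuals = y - K @ alpha                    # training residuals
    h = np.einsum('ij,ji->i', K, C_inv)          # diag of the hat matrix K C^{-1}
    loo_residuals = residuals / (1.0 - h)        # exact LOO residuals, no refits
    loo_predictions = y - loo_residuals
    return np.mean(np.sign(loo_predictions) != y)

# toy usage: two Gaussian blobs with +/-1 labels
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (50, 2)), rng.normal(+1.0, 1.0, (50, 2))])
y = np.concatenate([-np.ones(50), np.ones(50)])
K = rbf_kernel(X, X)
print("LOO error rate:", loo_error_kernel_ls(K, y))
```

The design point is that the expensive factorisation is shared across all ℓ held-out fits; only the cheap per-pattern correction by the hat-matrix diagonal differs, which is what brings the overall cost down to O(ℓ³).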

Keywords: Model selection, Cross-validation, Kernel Fisher discriminant analysis

Article history: Received 9 September 2002; Accepted 16 April 2003; Available online 12 July 2003.

DOI: https://doi.org/10.1016/S0031-3203(03)00136-5