Multiclass LS-SVMs: Moderated Outputs and Coding-Decoding Schemes

Authors: T. van Gestel, J. A. K. Suykens, G. Lanckriet, A. Lambrechts, B. de Moor, J. Vandewalle

Abstract

A common way of solving the multiclass categorization problem is to reformulate it as a set of binary classification problems. Discriminative binary classifiers such as Support Vector Machines (SVMs) directly optimize the decision boundary with respect to a certain cost function. In a pragmatic and computationally simple approach, Least Squares SVMs (LS-SVMs) are inferred by minimizing a related least-squares regression cost function. The moderated outputs of the binary classifiers are obtained in a second step within the evidence framework. In this paper, Bayes' rule is repeatedly applied to infer the posterior multiclass probabilities, using the moderated outputs of the binary plug-in classifiers and the prior multiclass probabilities. This Bayesian decoding motivates the use of loss-function-based decoding instead of Hamming decoding. For SVMs and LS-SVMs with a linear kernel, experimental evidence suggests the use of one-versus-one coding. With a Radial Basis Function kernel, one-versus-one and error-correcting output codes yield the best performance, but simpler codings may still give satisfactory results.
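To make the pipeline described in the abstract concrete, the following is a minimal sketch (not the authors' code) of one-versus-one coding with binary LS-SVM classifiers, contrasting a Hamming-style decoding on the output signs with a loss-based decoding that sums the continuous outputs. It assumes an RBF kernel and the standard LS-SVM dual linear system; all names (`ls_svm_fit`, `one_vs_one_fit`, `gamma`, `sigma`, the toy data) are illustrative assumptions, and the Bayesian moderation of outputs from the paper is omitted for brevity.

```python
# Sketch only: one-versus-one coding with binary LS-SVMs and two decodings.
import itertools
import numpy as np


def rbf_kernel(A, B, sigma=1.0):
    """RBF (Gaussian) kernel matrix between row-wise sample sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def ls_svm_fit(X, y, gamma=1.0, sigma=1.0):
    """Train a binary LS-SVM (labels y in {-1, +1}) by solving the dual
    linear system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y.astype(float)))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    # Return the continuous (latent) output f(x) = sum_i alpha_i K(x, x_i) + b.
    return lambda Xt: rbf_kernel(Xt, X, sigma) @ alpha + b


def one_vs_one_fit(X, y, **kw):
    """Train one binary LS-SVM per class pair (i, j) with i coded as +1."""
    classes = np.unique(y)
    models = {}
    for i, j in itertools.combinations(classes, 2):
        mask = (y == i) | (y == j)
        yb = np.where(y[mask] == i, 1.0, -1.0)
        models[(i, j)] = ls_svm_fit(X[mask], yb, **kw)
    return classes, models


def one_vs_one_predict(classes, models, Xt, hamming=True):
    """Decode pairwise outputs: Hamming-style voting on the signs, or a
    loss-based decoding that accumulates the continuous latent outputs."""
    scores = np.zeros((len(Xt), len(classes)))
    idx = {c: k for k, c in enumerate(classes)}
    for (i, j), f in models.items():
        out = f(Xt)
        contrib = np.sign(out) if hamming else out
        scores[:, idx[i]] += contrib
        scores[:, idx[j]] -= contrib
    return classes[np.argmax(scores, axis=1)]


if __name__ == "__main__":
    # Toy three-class problem with Gaussian blobs (illustrative data only).
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(m, 0.3, (30, 2)) for m in ((0, 0), (2, 0), (1, 2))])
    y = np.repeat([0, 1, 2], 30)
    classes, models = one_vs_one_fit(X, y, gamma=10.0, sigma=1.0)
    pred = one_vs_one_predict(classes, models, X, hamming=False)
    print("training accuracy:", (pred == y).mean())
```

In the paper's setting, the raw latent outputs used in the loss-based decoding above would first be moderated within the evidence framework and then combined with the class priors through Bayes' rule; the sketch only illustrates why decoding on continuous outputs can retain information that sign-based Hamming decoding discards.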

Keywords: Bayesian decoding, discriminant analysis, evidence framework, Hamming decoding, multiclass classification, regression, Support Vector Machines

Paper URL: https://doi.org/10.1023/A:1013815310229