Hyperdisk based large margin classifier

Authors:

Highlights:

Abstract

We introduce a large margin linear binary classification framework that approximates each class with a hyperdisk – the intersection of the affine support and the bounding hypersphere of its training samples in feature space – and then finds the linear classifier that maximizes the margin separating the two hyperdisks. We contrast this with Support Vector Machines (SVMs), which find the maximum-margin separator of the pointwise convex hulls of the training samples, arguing that replacing convex hulls with looser convex class models such as hyperdisks provides safer margin estimates that improve the accuracy on some problems. Both the hyperdisks and their separators are found by solving simple quadratic programs. The method is extended to nonlinear feature spaces using the kernel trick, and multi-class problems are handled by combining binary classifiers in the same way as for SVMs. Experiments on a range of data sets show that the method compares favourably with other popular large margin classifiers.
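One ingredient of the framework above is the bounding hypersphere of each class's training samples, which the abstract notes can be found by a simple quadratic program. Below is a minimal sketch of that step only – not the paper's implementation – solving the standard dual QP for the minimum enclosing ball (maximize Σᵢ αᵢ⟨xᵢ,xᵢ⟩ − Σᵢⱼ αᵢαⱼ⟨xᵢ,xⱼ⟩ subject to α ≥ 0, Σᵢ αᵢ = 1) with a general-purpose solver; the function name and the use of `scipy.optimize.minimize` are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def min_enclosing_hypersphere(X):
    """Sketch: bounding hypersphere of the rows of X via the dual QP.
    Returns (center, radius). Because the dual is stated entirely in
    inner products (the Gram matrix K), the same step kernelizes."""
    K = X @ X.T                      # Gram matrix of the samples
    d = np.diag(K)                   # <x_i, x_i> terms
    n = X.shape[0]
    # Negate the concave dual objective so we can minimize it.
    obj = lambda a: -(a @ d - a @ K @ a)
    cons = {"type": "eq", "fun": lambda a: a.sum() - 1.0}
    res = minimize(obj, np.full(n, 1.0 / n),
                   bounds=[(0.0, None)] * n,
                   constraints=cons, method="SLSQP")
    a = res.x
    center = a @ X                   # center is a convex combination of samples
    radius = np.sqrt(max(-res.fun, 0.0))  # optimal dual value equals radius^2
    return center, radius
```

For instance, for the four points (±1, 0) and (0, ±1) the sketch recovers the unit circle centered at the origin. The class's hyperdisk would then be this ball intersected with the affine hull of the samples; the separator between two such hyperdisks is a further QP not shown here.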

Keywords: Large margin classifier, Classification, Convex approximation, Hyperdisk, Kernel method, Support Vector Machine

Article history: Received 23 May 2012, Revised 25 September 2012, Accepted 3 November 2012, Available online 15 November 2012.

DOI: https://doi.org/10.1016/j.patcog.2012.11.004