A new maximum margin algorithm for one-class problems and its boosting implementation

Authors:

Highlights:

Abstract

In this paper, each one-class problem is regarded as the problem of estimating a function that is positive on a desired slab and negative on its complement. The main advantage of this viewpoint is that the loss function and the expected risk can be defined so that the slab contains as many samples as possible. Inspired by the nature of SVMs, an intuitive notion of margin is also defined. As a result, a new linear optimization problem that maximizes this margin, together with several theoretically motivated learning algorithms, is obtained. Moreover, the proposed algorithms can be implemented with boosting techniques to solve nonlinear one-class classification problems.
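The abstract only describes the approach at a high level, so the following is a minimal, hypothetical sketch of the kind of margin-maximizing linear program such a one-class method might solve, written with scipy.optimize.linprog. The base-hypothesis output matrix H, the slack penalty C, and the weight-normalization constraint are illustrative assumptions, not the authors' actual formulation.

```python
import numpy as np
from scipy.optimize import linprog

def one_class_lp_margin(H, C=1.0):
    """Hypothetical one-class margin LP (not the paper's exact formulation):
    find nonnegative weights a over base hypotheses so that
        sum_j a_j * H[i, j] >= rho - xi_i   for every sample i,
    maximizing the margin rho minus a slack penalty (C/n) * sum_i xi_i,
    with the normalization sum_j a_j = 1 fixing the margin scale."""
    n, m = H.shape
    # decision vector z = [a (m), xi (n), rho (1)]; linprog minimizes c . z
    c = np.concatenate([np.zeros(m), (C / n) * np.ones(n), [-1.0]])
    # margin constraints rewritten as: -H[i] . a - xi_i + rho <= 0
    A_ub = np.hstack([-H, -np.eye(n), np.ones((n, 1))])
    b_ub = np.zeros(n)
    # normalization constraint: sum_j a_j = 1
    A_eq = np.concatenate([np.ones(m), np.zeros(n), [0.0]]).reshape(1, -1)
    b_eq = np.array([1.0])
    bounds = [(0, None)] * m + [(0, None)] * n + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    a, rho = res.x[:m], res.x[-1]
    return a, rho

# usage: H[i, j] holds the output of base hypothesis h_j on training sample x_i
rng = np.random.default_rng(0)
H = rng.uniform(0.0, 1.0, size=(50, 5))
a, rho = one_class_lp_margin(H, C=10.0)
print("weights:", np.round(a, 3), "margin rho:", round(rho, 3))
```

In a boosting-style implementation, the columns of H would correspond to weak hypotheses generated iteratively rather than fixed in advance; this sketch only illustrates the linear-programming margin objective itself.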

Keywords: One-class problems, Outliers, Statistical learning theory, Support vector machines, Margin, Boosting

Article history: Received 30 January 2004; Revised 8 October 2004; Accepted 8 October 2004; Available online 2 March 2005.

DOI: https://doi.org/10.1016/j.patcog.2004.10.010