Imprecise prior knowledge incorporating into one-class classification
Authors: Lev V. Utkin, Yulia A. Zhuk
Abstract
An extension of Campbell and Bennett's novelty detection (one-class classification) model incorporating prior knowledge is studied in this paper. The proposed extension relaxes the strong assumption of an empirical probability distribution over the elements of a training set and instead deals with a set of probability distributions produced by prior knowledge about the training data. The classification problem is solved by considering the extreme points of the probability distribution set or by means of the conjugate duality technique. Special cases of prior knowledge are considered in detail, including the imprecise linear-vacuous mixture model and interval-valued moments of feature values. Numerical experiments show that the proposed models outperform Campbell and Bennett's model on many real and synthetic data sets.
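The linear-vacuous mixture mentioned in the abstract is the classical epsilon-contamination credal set: each probability in the set has the form (1-ε)·p0 + ε·q for the empirical distribution p0 and an arbitrary distribution q, and its extreme points place the whole ε mass on a single training example. The following toy sketch (not the authors' implementation; the losses and ε value are illustrative assumptions) enumerates those extreme points and shows that the minimax expected loss over the set reduces to a simple closed form:

```python
# Toy sketch: extreme points of the linear-vacuous (epsilon-contamination)
# credal set over n training examples, and the worst-case expected loss.
eps = 0.2                        # contamination level (assumed)
p0 = [0.25, 0.25, 0.25, 0.25]    # empirical distribution over 4 examples
loss = [0.1, 0.4, 0.2, 0.3]      # per-example losses (illustrative)

n = len(p0)
# Each extreme point shifts the whole epsilon mass onto one atom k.
extreme_points = [
    [(1 - eps) * p0[i] + (eps if i == k else 0.0) for i in range(n)]
    for k in range(n)
]

def expected_loss(p):
    return sum(pi * li for pi, li in zip(p, loss))

# Minimax over the credal set is attained at an extreme point and equals
# (1 - eps) * E_p0[loss] + eps * max(loss).
worst = max(expected_loss(p) for p in extreme_points)
closed_form = (1 - eps) * expected_loss(p0) + eps * max(loss)
assert abs(worst - closed_form) < 1e-12
print(round(worst, 4))  # -> 0.28
```

Because the maximum over a credal set of an expectation is always attained at an extreme point, only n candidate distributions need to be checked, which is what makes the extreme-point approach in the paper computationally tractable.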
Keywords: Machine learning, One-class classification, Minimax strategy, Novelty detection, Linear programming, Imprecise statistical model, Extreme points
DOI: https://doi.org/10.1007/s10115-013-0661-7