Hyperparameter selection of one-class support vector machine by self-adaptive data shifting

Authors:

Highlights:

• We propose a novel self-adaptive data-shifting-based method for one-class SVM (OCSVM) hyperparameter selection, a choice that has a significant influence on OCSVM performance.

• The proposed method generates a controllable number of high-quality pseudo outliers around the target data through efficient edge pattern detection and a “negative shifting” mechanism, which effectively regulates the OCSVM decision boundary toward an accurate target data description (a rough code sketch follows these highlights). Meanwhile, negative shifting soundly addresses two major difficulties of previous hyperparameter selection methods based on pseudo outlier generation.

• The proposed method also generates pseudo target data for validating the OCSVM model on the target class through a “positive shifting” mechanism (also illustrated in the sketch below), which provides an efficient alternative to time-consuming cross-validation or leave-one-out (LOO) validation. More importantly, positive shifting encourages robustness to noise in the given target data during hyperparameter selection, by generating non-noise pseudo target data for validation from the original, possibly noisy, samples.

• The proposed method yields superior performance compared with other state-of-the-art OCSVM hyperparameter selection methods on both synthetic 2-D datasets and various benchmark datasets.

• Unlike many previous methods that introduce additional hyperparameters into OCSVM hyperparameter selection, the proposed method is fully automatic and self-adaptive, leaving no additional hyperparameters for users to tune. Moreover, the method is not restricted to particular kernel functions such as the Gaussian kernel.
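The highlights above describe the two shifting mechanisms only at a high level. The following is a minimal Python sketch of one plausible reading of them, using scikit-learn's OneClassSVM: edge patterns are detected from k-nearest-neighbour directions, shifted outwards to form pseudo outliers (“negative shifting”) and inwards to form pseudo targets (“positive shifting”), and candidate (nu, gamma) pairs are scored against the two pseudo sets. The neighbourhood size k, the edge-pattern threshold, the shift magnitude, and the hyperparameter grid below are illustrative assumptions, not the paper's self-adaptive rules; see the paper (DOI below) for the exact procedure.

```python
# Illustrative sketch only -- NOT the authors' reference implementation.
# k, the edge threshold, the shift step and the (nu, gamma) grid are assumptions.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import OneClassSVM


def shift_data(X, k=None):
    """Generate pseudo targets (positive shifting) and pseudo outliers (negative shifting)."""
    n = X.shape[0]
    if k is None:
        k = max(2, int(np.ceil(5 * np.log10(n))))       # heuristic neighbourhood size
    dist, idx = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
    dist, idx = dist[:, 1:], idx[:, 1:]                  # drop each point itself

    # Mean unit vector from a point's neighbours towards the point:
    # roughly the direction of decreasing local density.
    diffs = X[:, None, :] - X[idx]                       # (n, k, d)
    diffs /= np.linalg.norm(diffs, axis=2, keepdims=True) + 1e-12
    direction = diffs.mean(axis=1)                       # (n, d)

    # Edge patterns: points whose neighbours lie mostly on one side.
    strength = np.linalg.norm(direction, axis=1)
    edge = strength >= strength.mean()                   # illustrative threshold

    unit = direction / (strength[:, None] + 1e-12)
    step = dist.mean(axis=1, keepdims=True)              # local kNN distance as step size

    pseudo_outliers = X[edge] + step[edge] * unit[edge]  # negative shifting: outwards
    pseudo_targets = X - step * unit                     # positive shifting: inwards
    return pseudo_targets, pseudo_outliers


def select_ocsvm(X, nus=(0.01, 0.05, 0.1), gammas=(0.01, 0.1, 1.0, 10.0)):
    """Pick the (nu, gamma) pair that best separates the two pseudo sets."""
    X_pos, X_neg = shift_data(X)
    best, best_score = None, -np.inf
    for nu in nus:
        for gamma in gammas:
            model = OneClassSVM(kernel="rbf", nu=nu, gamma=gamma).fit(X)
            acc_pos = np.mean(model.predict(X_pos) == 1)    # pseudo targets accepted
            acc_neg = np.mean(model.predict(X_neg) == -1)   # pseudo outliers rejected
            score = 0.5 * (acc_pos + acc_neg)
            if score > best_score:
                best, best_score = (nu, gamma), score
    return best
```

In this reading, each candidate model is scored by how well it accepts the pseudo target set and rejects the pseudo outlier set, which is what replaces cross-validation or LOO on the target class.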

Keywords: One-class SVM, Hyperparameter selection, Data shifting

Article history: Received 13 April 2017, Revised 29 June 2017, Accepted 6 September 2017, Available online 7 September 2017, Version of Record 26 September 2017.

DOI: https://doi.org/10.1016/j.patcog.2017.09.012