New validation methods for improving standard and multi-parametric support vector regression training time

Authors:

Highlights:

Abstract

The selection of hyper-parameters in support vector regression algorithms (SVMr) is an essential step in the training of these learning machines. Unfortunately, there is no exact method to obtain the optimal values of the SVMr hyper-parameters. It is therefore necessary to use a search algorithm, and often a validation method, to find the best combination of hyper-parameters. The problem is that the SVMr training time can be huge on large training databases if standard search algorithms and validation methods (such as grid search and K-fold cross-validation) are used. In this paper we propose two novel validation methods which reduce the SVMr training time while maintaining the accuracy of the final machine. We show the good performance of both methods on the standard SVMr with three hyper-parameters (where the hyper-parameter search is usually carried out by means of a grid search), and also in the extension to multi-parametric kernels, where meta-heuristic approaches such as evolutionary algorithms must be used to look for the best set of SVMr hyper-parameters. In all cases the new validation methods provide very good results in terms of training time, without affecting the final SVMr accuracy.
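The abstract contrasts the proposed validation methods with the standard baseline of a grid search over the three SVMr hyper-parameters (C, ε and the kernel width γ) validated by K-fold cross-validation. The sketch below is only a minimal illustration of that baseline, not of the paper's new methods; it assumes scikit-learn, synthetic data and arbitrary illustrative grids.

```python
# Baseline the paper aims to speed up: exhaustive grid search over the three
# standard SVR hyper-parameters (C, epsilon, gamma), each candidate evaluated
# with K-fold cross-validation. Data and grid values are illustrative only.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Synthetic regression problem standing in for a real training database.
X, y = make_regression(n_samples=500, n_features=10, noise=0.1, random_state=0)

# Scaling is refit inside each cross-validation fold via the pipeline.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf"))

param_grid = {
    "svr__C": np.logspace(-1, 3, 5),        # regularization strength
    "svr__epsilon": np.logspace(-3, 0, 4),  # width of the eps-insensitive tube
    "svr__gamma": np.logspace(-3, 1, 5),    # RBF kernel width
}

# 5-fold CV over every grid point: 5 * 4 * 5 * 5 = 500 SVR fits in total,
# which is the training-time cost the proposed validation methods try to cut.
search = GridSearchCV(model, param_grid, cv=5,
                      scoring="neg_mean_squared_error", n_jobs=-1)
search.fit(X, y)
print("Best hyper-parameters:", search.best_params_)
```

For multi-parametric kernels (e.g. one γ per input feature) the grid becomes intractable, which is why the abstract notes that meta-heuristics such as evolutionary algorithms replace the grid search in that setting.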

Keywords: Support vector regression algorithms, SVMr hyper-parameters, Training time, Validation methods

Article history: Available online 4 February 2012.

DOI: https://doi.org/10.1016/j.eswa.2012.01.142