Hyper-parameter optimization in classification: To-do or not-to-do

Authors:

Highlights:

• We found that hyper-parameter tuning is not well justified in many cases but still very useful in a few.

• We propose a framework to address the problem of deciding to-tune or not-to-tune.

• We implemented a prototype of the framework on 486 datasets and 4 algorithms.

• The results indicate our framework is effective at avoiding the effects of ineffective tuning.

• Our framework enables a life-long learning approach to the problem.
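To make the to-tune-or-not-to-tune question concrete, here is a minimal sketch (not the paper's actual framework) of the underlying decision: compare a classifier's default hyper-parameters against a tuned configuration and tune only when the gain is worthwhile. The dataset, algorithm, and search grid are illustrative assumptions.

```python
# Hypothetical sketch of the tune-vs-default decision; not the paper's framework.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Baseline: cross-validated accuracy with default hyper-parameters.
default_score = cross_val_score(
    DecisionTreeClassifier(random_state=0), X, y, cv=5
).mean()

# Tuned: small grid search over two hyper-parameters (grid is illustrative).
grid = GridSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_grid={"max_depth": [3, 5, None], "min_samples_leaf": [1, 5, 10]},
    cv=5,
)
grid.fit(X, y)
tuned_score = grid.best_score_

# "Not-to-tune" if the improvement is negligible relative to the tuning cost.
print(f"default={default_score:.3f} tuned={tuned_score:.3f}")
```

In practice a framework like the one proposed would predict, before running the search, whether the tuned score is likely to beat the default by enough to justify the cost.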


Keywords: Hyper-parameter optimization, Framework, Bayesian optimization, Machine learning, Incremental learning

Article history: Received 25 June 2019, Revised 19 October 2019, Accepted 25 January 2020, Available online 31 January 2020, Version of Record 25 February 2020.

DOI: https://doi.org/10.1016/j.patcog.2020.107245