A flexible transfer learning framework for Bayesian optimization with convergence guarantee

Authors:

Highlights:

• A transfer learning method to address the cold-start problem in Bayesian optimization.

• The proposed method can benefit from tasks of varying relatedness.

• Derivation of theoretical convergence guarantees for the method.

• Demonstration of the method on tuning the hyperparameters of machine learning algorithms (see the illustrative sketch below).
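For context only, the sketch below shows a generic Gaussian-process Bayesian optimization loop for tuning a single hyperparameter; it is not the paper's transfer learning framework, and every function name and setting in it is illustrative. It makes the "cold start" concrete: the first few evaluations are spent with an uninformative surrogate, which is the cost that transfer from related tasks aims to reduce.

```python
# Minimal Bayesian-optimization sketch (illustrative, NOT the paper's framework):
# a GP surrogate with an RBF kernel and Expected Improvement, minimizing a
# synthetic 1-D "validation loss" standing in for a hyperparameter-tuning objective.
import numpy as np
from scipy.stats import norm

def objective(x):
    # Hypothetical validation loss as a function of one hyperparameter.
    return np.sin(3.0 * x) + 0.1 * (x - 0.5) ** 2

def rbf_kernel(A, B, length_scale=0.2, variance=1.0):
    d = A[:, None] - B[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(X_train, y_train, X_test, noise=1e-5):
    # Standard GP regression posterior via Cholesky factorization.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    K_s = rbf_kernel(X_train, X_test)
    mu = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.clip(np.diag(rbf_kernel(X_test, X_test)) - np.sum(v ** 2, axis=0),
                  1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best_y):
    # Expected Improvement acquisition for minimization.
    z = (best_y - mu) / sigma
    return (best_y - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=3)   # a few initial evaluations (the "cold start")
y = objective(X)
candidates = np.linspace(0.0, 1.0, 200)

for _ in range(15):
    mu, sigma = gp_posterior(X, y, candidates)
    x_next = candidates[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

print(f"best hyperparameter ~ {X[np.argmin(y)]:.3f}, loss ~ {y.min():.3f}")
```

A transfer learning approach would, roughly speaking, use evaluations from related source tasks to make the surrogate informative from the first iteration; how this paper does so while retaining convergence guarantees is the subject of the work itself.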


Keywords: Bayesian optimization, Transfer learning, Gaussian process

Article history: Received 6 May 2018; Revised 12 August 2018; Accepted 13 August 2018; Available online 13 August 2018; Version of Record 11 September 2018.

DOI: https://doi.org/10.1016/j.eswa.2018.08.023