Focused multi-task learning in a Gaussian process framework

Authors: Gayle Leen, Jaakko Peltonen, Samuel Kaski

Abstract

Multi-task learning, the joint learning of a set of tasks, can improve performance on the individual tasks. Gaussian process models have been applied to learning a set of tasks on different data sets by constructing joint priors for the functions underlying the tasks. In these previous Gaussian process models the setting has been symmetric, in the sense that all tasks are assumed to be equally important, whereas in settings such as transfer learning the goal is asymmetric: to enhance performance in a target task given the other tasks. We propose a focused Gaussian process model which introduces an "explaining away" model for each of the additional tasks to capture their unrelated variation, in order to focus the transfer onto the task of interest. This focusing helps reduce the key problem of negative transfer, in which performance may even decrease when the tasks are not closely enough related. In experiments, our model improves performance compared to single-task learning, symmetric multi-task learning using hierarchical Dirichlet processes, transfer learning based on predictive structure learning, and symmetric multi-task learning with Gaussian processes.
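The abstract does not give the model equations, but the core idea, letting each additional task have its own task-specific component so that only shared structure is transferred to the target task, can be illustrated with an additive-kernel Gaussian process. The sketch below is an illustrative assumption under a shared-plus-private RBF kernel structure with toy data and made-up hyperparameters, not the authors' implementation.

```python
# Minimal sketch (assumed, not the paper's exact model) of asymmetric multi-task GP
# regression: the target task is modelled by a shared latent function only, while the
# secondary task gets an additional task-specific ("explaining away") kernel so that
# its unrelated variation is not transferred to the target task.
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

# Toy data: a target task with few observations, one secondary task with many.
rng = np.random.default_rng(0)
X_t = rng.uniform(-3, 3, (10, 1))                      # target-task inputs
y_t = np.sin(X_t[:, 0]) + 0.1 * rng.standard_normal(10)
X_s = rng.uniform(-3, 3, (40, 1))                      # secondary-task inputs
# Secondary task = shared signal + task-specific variation to be "explained away".
y_s = np.sin(X_s[:, 0]) + 0.5 * np.cos(3 * X_s[:, 0]) + 0.1 * rng.standard_normal(40)

X = np.vstack([X_t, X_s])
y = np.concatenate([y_t, y_s])
n_t = len(X_t)

# Joint covariance: shared kernel over all inputs; a private kernel is added only on
# the secondary-task block, absorbing variation that should not reach the target task.
K = rbf(X, X, lengthscale=1.0, variance=1.0)
K[n_t:, n_t:] += rbf(X_s, X_s, lengthscale=0.5, variance=0.5)
K += 0.1**2 * np.eye(len(X))                           # observation noise

# Predict the target task: cross-covariance uses only the shared kernel, so the
# secondary task contributes only through the shared component.
X_star = np.linspace(-3, 3, 100)[:, None]
K_star = rbf(X_star, X)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
mean_target = K_star @ alpha                           # posterior mean on the grid
```

Because the private kernel appears only inside the secondary-task block of the joint covariance, it does not induce covariance with the target task; the secondary data therefore inform the target prediction only through the shared component, which is the focusing effect the abstract describes.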

Keywords: Gaussian processes, Multi-task learning, Transfer learning, Negative transfer

Paper URL: https://doi.org/10.1007/s10994-012-5302-y