A multi-task transfer learning method with dictionary learning

Authors:

Highlights:

Abstract

Transfer learning addresses problems in which samples are generated from more than one domain, and focuses on transferring knowledge from source tasks to target tasks. A variety of methodologies have been proposed for transfer learning: some concentrate on the relationships within each domain, while others pay more attention to the knowledge transfer itself. In this paper, a new dictionary learning with multi-task transfer learning method (DMTTL) is proposed, based on the hinge loss and the SVM. Dictionary learning is used to learn sparse representations of the given samples, and a regularization term over the two dictionaries is exploited so that the similarity between samples can be well determined. In addition, a new optimization method based on alternate convex search is proposed, together with a convergence analysis indicating that DMTTL is a sound approach. A comparison of DMTTL with state-of-the-art approaches demonstrates its feasibility and competitive performance on multi-task classification problems, and the statistical results show that the proposed method outperforms previous methods.
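The abstract's core building block is sparse coding over a learned dictionary. As a rough illustration only (the paper's actual DMTTL formulation, its hinge-loss objective, and its cross-dictionary regularizer are not reproduced here), the sketch below solves the standard lasso sparse-coding subproblem min_a 0.5·||x − Da||² + λ·||a||₁ for a fixed dictionary `D` using ISTA; all names and parameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the l1 norm: shrink each entry toward zero by t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_code(D, x, lam=0.05, n_iter=200):
    """Approximate argmin_a 0.5*||x - D a||^2 + lam*||a||_1 via ISTA.

    Illustrative sketch only; DMTTL couples such codes across tasks,
    which this single-sample solver does not attempt to show.
    """
    L = np.linalg.norm(D, 2) ** 2       # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)        # gradient of the quadratic term
        a = soft_threshold(a - grad / L, lam / L)
    return a

# Synthetic check: recover a sparse code for a sample built from 3 atoms.
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)          # unit-norm dictionary atoms
a_true = np.zeros(50)
a_true[[3, 17, 41]] = [1.5, -2.0, 1.0]
x = D @ a_true
a_hat = sparse_code(D, x)
print(np.count_nonzero(np.abs(a_hat) > 1e-3), "active atoms")
```

In the multi-task setting the paper describes, one such dictionary would be learned per task, with the regularization term over the two dictionaries encouraging related tasks to produce comparable sparse representations.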

Keywords: Transfer learning, Dictionary learning, Support vector machine

Article history: Received 16 June 2019, Revised 4 September 2019, Accepted 12 November 2019, Available online 18 November 2019, Version of Record 8 February 2020.

DOI: https://doi.org/10.1016/j.knosys.2019.105233