Deep multi-task learning with relational attention for business success prediction

Authors:

Highlights:

Abstract

Multi-task learning is a promising branch of machine learning that aims to improve the generalization of prediction models by sharing knowledge among tasks. Most existing multi-task learning methods rely on predefined task relationships and guide model learning through linear regularization terms. On the one hand, improperly specified task relationships may cause negative knowledge transfer; on the other hand, these methods also suffer from insufficient representation ability. To overcome these problems, this paper focuses on attention-based deep multi-task learning and proposes a novel method, Deep Multi-task Learning with Relational Attention (DMLRA). In particular, we first design a task-specific attention module that selects features for each learning task, since different prediction tasks may rely on different parts of the shared feature set. Then, we design a relational attention module that learns the relationships among multiple tasks automatically, so that knowledge is transferred across tasks according to these learned positive or negative relationships. Moreover, we provide a joint deep multi-task learning framework that combines the task-specific attention module and the relational attention module. Finally, we apply our method to a multi-criteria business success assessment problem, with both classical and state-of-the-art multi-task learning methods employed as baselines. Experiments on real-world datasets demonstrate the superiority of our method over existing methods.
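To make the two attention modules concrete, the following PyTorch-style sketch shows one plausible reading of the architecture the abstract describes: a shared encoder, a task-specific attention that re-weights shared features per task, a relational attention that learns task-to-task relationships, and per-task prediction heads. Module names, tensor shapes, and the exact attention formulations are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class TaskSpecificAttention(nn.Module):
    """Soft feature selection: each task re-weights the shared feature vector."""

    def __init__(self, feat_dim: int, num_tasks: int):
        super().__init__()
        # One learnable attention vector over the shared features per task (assumed design).
        self.feature_logits = nn.Parameter(torch.zeros(num_tasks, feat_dim))

    def forward(self, shared: torch.Tensor) -> torch.Tensor:
        # shared: (batch, feat_dim)
        weights = torch.softmax(self.feature_logits, dim=-1)      # (tasks, feat_dim)
        # Broadcast so each task sees its own weighting of the shared features.
        return shared.unsqueeze(1) * weights.unsqueeze(0)          # (batch, tasks, feat_dim)


class RelationalAttention(nn.Module):
    """Self-attention across tasks: task relationships are learned, not predefined."""

    def __init__(self, feat_dim: int):
        super().__init__()
        self.query = nn.Linear(feat_dim, feat_dim)
        self.key = nn.Linear(feat_dim, feat_dim)
        self.value = nn.Linear(feat_dim, feat_dim)

    def forward(self, task_feats: torch.Tensor) -> torch.Tensor:
        # task_feats: (batch, tasks, feat_dim)
        q, k, v = self.query(task_feats), self.key(task_feats), self.value(task_feats)
        scores = q @ k.transpose(-2, -1) / k.size(-1) ** 0.5       # (batch, tasks, tasks)
        relation = torch.softmax(scores, dim=-1)                   # learned task relations
        # Each task representation is refined with knowledge gathered from related tasks.
        return task_feats + relation @ v


class DMLRASketch(nn.Module):
    """Joint framework: shared encoder -> task-specific attention -> relational attention -> heads."""

    def __init__(self, in_dim: int, feat_dim: int, num_tasks: int):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.task_att = TaskSpecificAttention(feat_dim, num_tasks)
        self.rel_att = RelationalAttention(feat_dim)
        self.heads = nn.ModuleList([nn.Linear(feat_dim, 1) for _ in range(num_tasks)])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        shared = self.encoder(x)                                   # (batch, feat_dim)
        per_task = self.task_att(shared)                           # (batch, tasks, feat_dim)
        refined = self.rel_att(per_task)                           # (batch, tasks, feat_dim)
        preds = [head(refined[:, i]) for i, head in enumerate(self.heads)]
        return torch.cat(preds, dim=-1)                            # (batch, tasks)


if __name__ == "__main__":
    model = DMLRASketch(in_dim=32, feat_dim=64, num_tasks=4)
    print(model(torch.randn(8, 32)).shape)  # torch.Size([8, 4])
```

In this reading, the softmax over the task dimension in the relational attention plays the role of the learned task-relationship matrix, replacing the predefined relationships and linear regularization terms criticized in the abstract.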

Keywords: Multi-task learning, Attention, Site selection

Article history: Received 16 July 2019, Revised 14 March 2020, Accepted 21 May 2020, Available online 20 June 2020, Version of Record 1 November 2020.

DOI: https://doi.org/10.1016/j.patcog.2020.107469