Attention-based skill translation models for expert finding

Authors:

Highlights:

• Proposing two attention-based neural networks for the task of skill translation.

• Using skill translations to bridge the semantic gap between skills and candidates.

• Utilizing context-aware information to find semantically related translations for skills.

• Achieving the best performance even with only a few translations.

• Demonstrating the usability of the proposed networks as multi-label document classifiers.


Keywords: Expert finding, Semantic matching, Translation models, StackOverflow

Article history: Received 11 February 2021, Revised 20 November 2021, Accepted 19 December 2021, Available online 11 January 2022, Version of Record 19 January 2022.

DOI: https://doi.org/10.1016/j.eswa.2021.116433