Fine-grained citation count prediction via a transformer-based model with among-attention mechanism

Authors:

Highlights:

• This paper proposes a fine-grained citation count prediction task (FGCCP), which predicts the in-text citation count for each structural function of a paper separately.

• This paper proposes a transformer-based model (MTAT) that employs a novel among-attention mechanism.

• Our empirical results confirm that MTAT achieves satisfactory prediction accuracy and surpasses common machine learning and deep learning models on FGCCP.

• The among-attention mechanism effectively captures the internal relationships among the tasks, and MTAT can easily be deployed in other multi-task learning settings (a sketch of the idea follows these highlights).
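The highlights describe a shared encoder feeding several per-function prediction heads, with attention applied among the task-specific representations. Below is a minimal, hypothetical PyTorch sketch of that idea; the class names (AmongAttention, MTATSketch), the pooling via learned task queries, and all hyperparameters are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch: scaled dot-product attention applied ACROSS the
# task-specific representations, so each structural-function head can
# borrow signal from the others. Not the paper's exact architecture.
import torch
import torch.nn as nn


class AmongAttention(nn.Module):
    """Attention over the task axis of stacked task representations."""

    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, task_reps: torch.Tensor) -> torch.Tensor:
        # task_reps: (batch, num_tasks, dim), one vector per structural function
        q, k, v = self.q(task_reps), self.k(task_reps), self.v(task_reps)
        attn = torch.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        # Residual connection keeps each task's own representation dominant.
        return task_reps + attn @ v


class MTATSketch(nn.Module):
    """Toy multi-task regressor: shared encoder -> among-attention -> heads."""

    def __init__(self, dim: int = 256, num_tasks: int = 5):
        super().__init__()
        # Stand-in for the shared transformer encoder over paper content.
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dim, nhead=8, batch_first=True),
            num_layers=2,
        )
        # One learned query per task pools the encoded sequence per task.
        self.task_queries = nn.Parameter(torch.randn(num_tasks, dim))
        self.among = AmongAttention(dim)
        self.heads = nn.ModuleList(nn.Linear(dim, 1) for _ in range(num_tasks))

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, seq_len, dim), pre-embedded input text
        enc = self.encoder(tokens)
        # Attention-pool one task-specific vector per learned task query.
        scores = torch.einsum("td,bsd->bts", self.task_queries, enc)
        task_reps = torch.softmax(scores, dim=-1) @ enc  # (batch, tasks, dim)
        task_reps = self.among(task_reps)
        # One citation-count prediction per structural function.
        return torch.cat(
            [head(task_reps[:, i]) for i, head in enumerate(self.heads)], dim=-1
        )
```

Because the among-attention layer only assumes a stack of task representations, the same block could be dropped into other multi-task models, which is what the final highlight suggests.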

Keywords: Citation count prediction, Functional structure, Neural network, Content-based citation analysis

Article history: Received 25 July 2021; Revised 14 October 2021; Accepted 17 October 2021; Available online 9 November 2021; Version of Record 9 November 2021.

DOI: https://doi.org/10.1016/j.ipm.2021.102799