Hierarchical BERT with an adaptive fine-tuning strategy for document classification

Authors:

Highlights:

• The hierarchical BERT model consists of a local encoder and a global encoder.

• An adaptive fine-tuning strategy improves the performance of pretrained language models (PLMs).

• An attention-based gated memory network models global information.
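The highlights above outline a two-level architecture: a local encoder produces a vector per document segment, and a global encoder with an attention-based gated memory aggregates those vectors into one document representation. The paper's exact components are not given here, so the following is only a minimal NumPy sketch of that idea, with a mean-pooling stand-in for the BERT local encoder and a simple scalar gate; all function names and the gating form are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def local_encoder(segment_token_ids, embeddings):
    # Stand-in for BERT over one segment: mean-pool the token embeddings.
    # (In the paper, this role is played by a pretrained BERT encoder.)
    return embeddings[segment_token_ids].mean(axis=0)

def attention_gated_global(segment_vecs):
    # Global encoder sketch: attend over segment vectors, then gate the
    # attended context against a pooled "memory" of all segments.
    memory = segment_vecs.mean(axis=0)            # pooled document memory
    scores = segment_vecs @ memory                # attention logits per segment
    w = np.exp(scores - scores.max())
    w /= w.sum()                                  # softmax attention weights
    context = w @ segment_vecs                    # attention-weighted context
    g = 1.0 / (1.0 + np.exp(-(context @ memory))) # scalar gate (assumed form)
    return g * context + (1.0 - g) * memory       # gated document vector

# Toy usage: 3 segments of a document, random token embeddings.
rng = np.random.default_rng(0)
emb = rng.normal(size=(20, 8))                    # vocab of 20, dim 8
segments = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]
seg_vecs = np.stack([local_encoder(s, emb) for s in segments])
doc_vec = attention_gated_global(seg_vecs)        # shape (8,)
```

The document vector `doc_vec` would then feed a classification head; the gate decides how much the attended context overrides the pooled memory.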

Keywords: Document classification, Hierarchical BERT, Adaptive fine-tuning strategy, Pretrained language model

Article history: Received 30 January 2021, Revised 6 November 2021, Accepted 2 December 2021, Available online 10 December 2021, Version of Record 27 December 2021.

DOI: https://doi.org/10.1016/j.knosys.2021.107872