Pre-train, Interact, Fine-tune: a novel interaction representation for text classification

Authors:

Highlights:

• We propose a novel pipeline for the task of text classification, i.e., Pre-train, Interact, Fine-tune (PIF); see the illustrative sketch after this list.

• To the best of our knowledge, ours is the first attempt to model word interactions for text representation.

• We introduce a two-perspective interaction representation for text classification.

• We propose a hybrid language model pre-training and fine-tuning scheme.

• We find that our proposal outperforms the state-of-the-art methods for text classification in terms of accuracy.
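The highlights name a three-stage pipeline (pre-train, interact, fine-tune) built around word-to-word interactions. Below is a minimal, hypothetical PyTorch sketch of what such a pipeline could look like; the specific choices (a GRU standing in for the pre-trained language model encoder, dot-product word interactions, mean pooling, a linear classifier head for fine-tuning) are illustrative assumptions and not the paper's actual PIF architecture.

```python
import torch
import torch.nn as nn

class InteractionTextClassifier(nn.Module):
    """Hypothetical PIF-style sketch: pre-trained encoder -> word interactions -> classifier.

    Assumptions (not taken from the paper): the pre-trained component is an
    embedding plus GRU encoder, the "interaction" is a pairwise dot product
    between contextual word vectors, and fine-tuning trains a linear head on
    a pooled interaction representation.
    """

    def __init__(self, vocab_size: int, embed_dim: int = 128, num_classes: int = 2):
        super().__init__()
        # Stand-in for a pre-trained language-model encoder.
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.GRU(embed_dim, embed_dim, batch_first=True)
        # Classification head trained during fine-tuning.
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # Contextual word representations: (batch, seq_len, embed_dim).
        hidden, _ = self.encoder(self.embedding(token_ids))
        # Pairwise word-word interaction matrix: (batch, seq_len, seq_len).
        interactions = torch.softmax(
            torch.matmul(hidden, hidden.transpose(1, 2)), dim=-1
        )
        # Re-weight each word by its interactions with every other word,
        # then mean-pool into a single text representation.
        interacted = torch.matmul(interactions, hidden)
        text_repr = interacted.mean(dim=1)
        return self.classifier(text_repr)

if __name__ == "__main__":
    model = InteractionTextClassifier(vocab_size=1000)
    dummy_batch = torch.randint(0, 1000, (4, 16))  # 4 texts, 16 tokens each
    print(model(dummy_batch).shape)  # torch.Size([4, 2])
```

In this sketch the encoder weights would come from language-model pre-training and the interaction and classifier layers would be learned (and the encoder adapted) during fine-tuning on the classification task; the paper's two-perspective interaction representation is not reproduced here.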

Abstract:


Keywords: Interaction representation, Pre-training, Fine-tuning, Classification

Article history: Received 12 October 2019, Revised 3 January 2020, Accepted 26 January 2020, Available online 7 February 2020, Version of Record 20 October 2020.

DOI: https://doi.org/10.1016/j.ipm.2020.102215