Hybrid multi-document summarization using pre-trained language models

Authors:

Highlights:

• Introducing a multi-document summarizer, called HMSumm, based on pre-trained language models.

• Employing both extractive and abstractive methods to generate the final summary (see the sketch after these highlights).

• Comparing the performance with state-of-the-art single- and multi-document summarizers.
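The highlights above describe a two-stage pipeline: an extractive step that selects salient sentences from the input documents, followed by an abstractive step that rewrites them with a pre-trained language model. The Python sketch below illustrates only this general extract-then-abstract idea, not the HMSumm algorithm itself; the TF-IDF sentence scoring, the period-based sentence splitting, and the facebook/bart-large-cnn model are placeholder assumptions made for illustration.

# Illustrative hybrid (extractive -> abstractive) multi-document summarization sketch.
# NOT the HMSumm method from the paper; it only mirrors the general two-stage idea
# stated in the highlights. All model names and thresholds below are assumptions.

from sklearn.feature_extraction.text import TfidfVectorizer
from transformers import pipeline  # pre-trained abstractive summarizer


def extractive_stage(documents, top_k=5):
    """Score sentences across all documents by TF-IDF weight and keep the top_k."""
    # Crude period-based sentence splitting, for illustration only.
    sentences = [s.strip() for doc in documents for s in doc.split(".") if s.strip()]
    tfidf = TfidfVectorizer().fit_transform(sentences)
    scores = tfidf.sum(axis=1).A1  # sentence salience = sum of its term weights
    ranked = sorted(range(len(sentences)), key=lambda i: scores[i], reverse=True)
    # Keep the top_k sentences, restored to their original order.
    return ". ".join(sentences[i] for i in sorted(ranked[:top_k])) + "."


def abstractive_stage(text, max_len=120):
    """Rewrite the extracted sentences with a pre-trained seq2seq model (assumed: BART)."""
    summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
    result = summarizer(text, max_length=max_len, min_length=10, do_sample=False)
    return result[0]["summary_text"]


if __name__ == "__main__":
    docs = [
        "Document one text goes here. It discusses topic A in detail. More sentences follow.",
        "Document two covers topic A from another angle. It also adds background on topic B.",
    ]
    extracted = extractive_stage(docs)
    print(abstractive_stage(extracted))

The keywords suggest that the paper's actual extractive selection relies on a determinantal point process and a deep submodular network rather than the TF-IDF ranking used in this sketch.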

Abstract:


Keywords: Pre-trained language models, Extractive summarization, Abstractive summarization, Determinantal point process, Deep submodular network

Article history: Received 18 September 2020, Revised 13 November 2021, Accepted 23 November 2021, Available online 18 December 2021, Version of Record 22 December 2021.

DOI: https://doi.org/10.1016/j.eswa.2021.116292