Sentiment Lossless Summarization

Authors:

Highlights:

Abstract

The aim of automatic text summarization (ATS) is to extract representative texts from documents while keeping the major points of the extracted texts consistent with the original documents. However, most existing studies ignore the loss of sentimental information in the summarization process, which leads to summaries that lose sentiment. To address this sentiment loss issue, we introduce a sentiment compensation mechanism into document summarization and propose a graph-based extractive summarization approach named Sentiment Lossless Summarization (SLS). SLS first creates a graph representation of a document to obtain the importance score (i.e., literal indicator) of each sentence. Second, sentiment dictionaries are leveraged to analyze the sentiment of each sentence. Third, in each summarization iteration, the sentence with the lowest score is removed, and the sentiment compensation weights of the remaining sentences are updated. With the help of sentiment compensation during the summarization process, sentiment consistency between candidate summaries and the original documents is maintained. Intrinsic evaluations conducted on the DUC2001, DUC2002, DUC2004, and Multi-News datasets demonstrate that our approach outperforms baselines and state-of-the-art summarization methods in terms of Recall-Oriented Understudy for Gisting Evaluation (ROUGE) scores. Additionally, to further evaluate the sentiment retention of SLS, extrinsic evaluations are introduced: summary quality in terms of sentiment loss is assessed by measuring the prediction accuracy for sentiment polarities of movie (IMDb dataset) and product (Amazon dataset) review summaries. The experimental results demonstrate that our approach improves prediction accuracy by up to 6% over the baseline.
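The iterative removal-and-compensation loop described in the abstract can be sketched as follows. This is an illustrative assumption of how such a loop might look, not the paper's exact method: the function names, the dictionary-based sentiment scorer, and the uniform redistribution of a removed sentence's sentiment are all hypothetical.

```python
def sentiment(sentence, lexicon):
    """Sum dictionary polarities of the words in a sentence
    (a toy stand-in for the paper's sentiment dictionaries)."""
    return sum(lexicon.get(w, 0.0) for w in sentence.lower().split())

def sls_summarize(sentences, literal_scores, lexicon, k):
    """Iteratively drop the lowest-scored sentence and spread its
    sentiment over the survivors as a compensation weight, so the
    summary's overall polarity stays close to the document's.
    `literal_scores` are the graph-based importance scores
    (e.g., from a TextRank-style sentence graph)."""
    compensation = {i: 0.0 for i in range(len(sentences))}
    remaining = set(range(len(sentences)))
    while len(remaining) > k:
        # combined score: literal indicator + current sentiment compensation
        worst = min(remaining, key=lambda i: literal_scores[i] + compensation[i])
        remaining.remove(worst)
        # redistribute the removed sentence's sentiment uniformly (assumption)
        lost = sentiment(sentences[worst], lexicon)
        for i in remaining:
            compensation[i] += abs(lost) / len(remaining)
    return [sentences[i] for i in sorted(remaining)]

# Toy usage: keep 2 of 3 review sentences
sentences = ["the plot is wonderful", "the pacing is slow", "a great film overall"]
scores = [0.5, 0.2, 0.4]
lexicon = {"wonderful": 1.0, "slow": -0.5, "great": 1.0}
summary = sls_summarize(sentences, scores, lexicon, k=2)
```

In this sketch the compensation weight raises the score of sentences that carry sentiment similar in magnitude to what has already been dropped, making them less likely to be removed in later iterations.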

Keywords: Graph-based summarization, Extractive summarization, Sentiment analysis

Article history: Received 25 December 2020, Revised 12 May 2021, Accepted 22 May 2021, Available online 24 May 2021, Version of Record 29 May 2021.

DOI: https://doi.org/10.1016/j.knosys.2021.107170