Aspect-based sentiment analysis with alternating coattention networks

Authors:

Highlights:

Abstract

Aspect-based sentiment analysis aims to predict the sentiment polarities of specific targets in a given text. Recent research has shown great interest in modeling the target and context with attention networks to obtain more effective feature representations for the sentiment classification task. However, computing attention scores over the context from a simple average of the target word vectors ignores the differing importance of the target words, and the interaction mechanism between target and context remains simple and needs further improvement. To address these problems, this paper first proposes a coattention mechanism that alternately models target-level and context-level attention, so as to focus on the key words of the target and learn a more effective context representation. On this basis, we implement a Coattention-LSTM network that learns nonlinear representations of context and target simultaneously and extracts more effective sentiment features through the coattention mechanism. Furthermore, a Coattention-MemNet network that adopts a multi-hop coattention mechanism is proposed to improve the sentiment classification results. Finally, we propose a new location weighted function that incorporates location information to enhance the performance of the coattention mechanism. Extensive experiments on two public datasets demonstrate the effectiveness of all proposed methods, and our experimental findings provide new insights for the future development of attention mechanisms and deep neural networks for aspect-based sentiment analysis.
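The alternating coattention described in the abstract can be illustrated with a minimal sketch: a coarse summary of the context first attends over the target words, and the resulting attended target vector then attends over the context to produce the final sentiment feature. The bilinear scoring form and all names here (alternating_coattention, H_c, H_t, W_t, W_c) are illustrative assumptions, not the authors' exact formulation.

```python
# Minimal sketch of alternating (target-level then context-level) coattention.
# Assumes bilinear attention scoring; shapes and names are illustrative only.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def alternating_coattention(H_c, H_t, W_t, W_c):
    """H_c: context hidden states (n x d); H_t: target hidden states (m x d).
    Step 1: target-level attention, scored against the averaged context.
    Step 2: context-level attention, scored against the attended target vector.
    Returns a context representation usable as the sentiment feature."""
    c_avg = H_c.mean(axis=0)            # coarse context summary
    alpha = softmax(H_t @ W_t @ c_avg)  # target-level attention weights
    t_att = alpha @ H_t                 # attended target vector (key target words)
    beta = softmax(H_c @ W_c @ t_att)   # context-level attention weights
    return beta @ H_c                   # final context feature

# Toy usage with random hidden states
rng = np.random.default_rng(0)
n, m, d = 7, 3, 16
H_c, H_t = rng.normal(size=(n, d)), rng.normal(size=(m, d))
W_t, W_c = rng.normal(size=(d, d)), rng.normal(size=(d, d))
print(alternating_coattention(H_c, H_t, W_t, W_c).shape)  # (16,)
```

In the multi-hop (Coattention-MemNet) variant described in the abstract, this attend-then-refine step would be repeated several times over a memory of context states; the single-pass version above only illustrates the basic alternation.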

Keywords: Aspect-based sentiment analysis, Coattention, LSTM, Memory network, Location weighted function

Article history: Received 29 July 2018, Revised 2 November 2018, Accepted 3 December 2018, Available online 21 January 2019, Version of Record 21 January 2019.

DOI: https://doi.org/10.1016/j.ipm.2018.12.004