Attention-based long short-term memory network using sentiment lexicon embedding for aspect-level sentiment analysis in Korean

Authors:

Highlights:

• Current word embedding methods have several limitations despite their usefulness.

• This paper proposes a sentiment lexicon embedding method to mitigate these limitations.

• The proposed embedding does not require external sentiment lexicon resources.

• The proposed method better represents the semantic relations between sentiment words.

• The proposed approach improves the performance of aspect-level sentiment analysis.
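The highlights mention a sentiment lexicon embedding fed into an attention-based LSTM for aspect-level sentiment analysis. The following is only a minimal illustrative sketch of that general idea in PyTorch, not the architecture described in the paper: it assumes a hypothetical model in which each word embedding is concatenated with a learned embedding of a coarse sentiment-lexicon tag, and aspect-conditioned attention is applied over the LSTM hidden states.

```python
# Minimal sketch (PyTorch) of an attention-based LSTM with lexicon-tag embeddings.
# All layer sizes, tag sets, and names here are illustrative assumptions, not the
# paper's specification.
import torch
import torch.nn as nn

class LexiconAttnLSTM(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, lex_dim=8, hidden_dim=128,
                 num_aspects=5, num_classes=3, num_lexicon_tags=4):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        # Embedding of coarse lexicon tags (e.g. positive / negative / neutral / none)
        self.lex_emb = nn.Embedding(num_lexicon_tags, lex_dim)
        self.aspect_emb = nn.Embedding(num_aspects, hidden_dim)
        self.lstm = nn.LSTM(emb_dim + lex_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim * 2, 1)
        self.out = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens, lex_tags, aspect):
        # tokens, lex_tags: (batch, seq_len); aspect: (batch,)
        x = torch.cat([self.word_emb(tokens), self.lex_emb(lex_tags)], dim=-1)
        h, _ = self.lstm(x)                                     # (batch, seq_len, hidden)
        a = self.aspect_emb(aspect).unsqueeze(1).expand_as(h)   # broadcast aspect vector
        scores = self.attn(torch.cat([h, a], dim=-1)).squeeze(-1)
        weights = torch.softmax(scores, dim=-1)                 # attention over time steps
        context = torch.bmm(weights.unsqueeze(1), h).squeeze(1)
        return self.out(context)                                # sentiment logits

# Example forward pass with random token and tag indices
model = LexiconAttnLSTM(vocab_size=1000)
logits = model(torch.randint(0, 1000, (2, 12)),
               torch.randint(0, 4, (2, 12)),
               torch.tensor([0, 3]))
print(logits.shape)  # torch.Size([2, 3])
```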


Keywords: Attention mechanism, Embedding learning, LSTM, Sentiment analysis

Article history: Received 12 July 2018, Revised 6 November 2018, Accepted 6 December 2018, Available online 9 January 2019, Version of Record 9 January 2019.

DOI: https://doi.org/10.1016/j.ipm.2018.12.005