Contextual sentiment embeddings via bi-directional GRU language model
Authors:
Abstract:
Compared with conventional word embeddings, sentiment embeddings can distinguish words that share similar contexts but carry opposite sentiment. They incorporate sentiment information from labeled corpora or lexicons, either through end-to-end training or through sentiment refinement. However, these methods have two major limitations. First, traditional approaches assign each word a fixed representation and ignore how word meaning shifts across contexts; the polarity of an emotional word may vary with context, yet the word receives the same representation everywhere. Second, out-of-vocabulary (OOV) or informally written sentiment words are assigned generic vectors (e.g.,
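The title and abstract describe contextual embeddings produced by a bi-directional GRU language model, where a word's vector depends on its surrounding context rather than being fixed. Below is a minimal NumPy sketch of this general idea (not the paper's actual model): a standard GRU cell run forward and backward over a token sequence, with the two hidden states concatenated per token. All names, dimensions, and the random initialization are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Standard GRU cell with randomly initialized weights (illustrative sketch,
    not the paper's implementation)."""
    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        init = lambda *shape: rng.normal(0.0, 0.1, shape)
        self.Wz, self.Uz = init(hidden_dim, input_dim), init(hidden_dim, hidden_dim)
        self.Wr, self.Ur = init(hidden_dim, input_dim), init(hidden_dim, hidden_dim)
        self.Wh, self.Uh = init(hidden_dim, input_dim), init(hidden_dim, hidden_dim)
        self.hidden_dim = hidden_dim

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)          # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)          # reset gate
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))  # candidate state
        return (1.0 - z) * h + z * h_tilde

def contextual_embeddings(tokens, fwd, bwd):
    """Return one context-dependent vector per token: the concatenation of the
    forward and backward GRU hidden states at that position."""
    h = np.zeros(fwd.hidden_dim)
    fwd_states = []
    for x in tokens:                       # left-to-right pass
        h = fwd.step(x, h)
        fwd_states.append(h)
    h = np.zeros(bwd.hidden_dim)
    bwd_states = []
    for x in reversed(tokens):             # right-to-left pass
        h = bwd.step(x, h)
        bwd_states.append(h)
    bwd_states.reverse()
    return [np.concatenate([f, b]) for f, b in zip(fwd_states, bwd_states)]
```

The key property motivating the paper: the same input token placed in two different contexts yields two different output vectors, because each vector also encodes the preceding (forward pass) and following (backward pass) tokens.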
Keywords: Contextual sentiment embeddings, Sentiment analysis, Pre-trained language model, Gated recurrent unit
Review history: Received 11 December 2020, Revised 28 August 2021, Accepted 28 October 2021, Available online 1 November 2021, Version of Record 11 November 2021.
DOI: https://doi.org/10.1016/j.knosys.2021.107663