Knowledge Augmented Dialogue Generation with Divergent Facts Selection

Authors:

Highlights:

Abstract

End-to-end open-domain dialogue is a challenging task, as existing neural models suffer from the issue of trivial responses. Employing background knowledge, a major remedy, has proven effective in improving response quality. However, less attention has been paid to selecting the appropriate knowledge in scenarios where the utterance subject drifts between the two partners, which can prevent a model from learning to access knowledge correctly. In this paper, we propose a novel Knowledge Augmented Dialogue Generation (KADG) model to facilitate both knowledge selection and knowledge incorporation in open-domain dialogue systems. The core components of KADG are the Divergent Knowledge Selector (DKS) and the Knowledge Aware Decoder (KAD). DKS performs one-hop subject reasoning over knowledge by pre-optimizing each knowledge candidate with an inferred drift clue. The drift clue captures the potential subject association of the current conversation and serves to bridge the subject gap during knowledge selection. Thereafter, KAD makes full use of the selected knowledge to generate responses that are both contextually coherent and knowledgeable. Comprehensive experiments on the recently released knowledge-grounded conversation dataset Wizard-of-Wikipedia verify the superiority of our model over previous baselines and show that our method refers to the knowledge properly and generates diverse and informative responses.
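The selection mechanism described above — scoring knowledge candidates against a query that fuses the dialogue context with an inferred drift clue — can be illustrated with a minimal sketch. This is an assumption-laden toy, not the authors' exact formulation: the additive fusion of context and clue vectors, the dot-product scoring, and all shapes here are illustrative choices.

```python
import numpy as np

def select_knowledge(context_vec, drift_clue_vec, knowledge_vecs):
    """Toy knowledge selection: score each candidate fact by dot-product
    attention against a query fusing dialogue context and drift clue.
    Additive fusion and dot-product scoring are illustrative assumptions."""
    query = context_vec + drift_clue_vec      # fuse context with drift clue
    scores = knowledge_vecs @ query           # one relevance score per candidate
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()                      # softmax over candidates
    return int(np.argmax(probs)), probs

# Random toy vectors standing in for learned encodings.
rng = np.random.default_rng(0)
ctx = rng.normal(size=8)          # encoded dialogue context
clue = rng.normal(size=8)         # inferred drift-clue representation
cands = rng.normal(size=(5, 8))   # 5 candidate knowledge facts
idx, p = select_knowledge(ctx, clue, cands)
print(idx, p.shape)
```

In the full model, the selected fact's representation would then condition the decoder so that the generated response stays grounded in that knowledge.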

Keywords: Open-domain dialogue systems, Knowledge selection, Subject drift, Attention mechanism

Article history: Received 8 April 2020, Revised 15 September 2020, Accepted 18 September 2020, Available online 29 September 2020, Version of Record 5 October 2020.

DOI: https://doi.org/10.1016/j.knosys.2020.106479