Knowledge diffusion for neural dialogue generation

149 citations · 266 Mendeley readers

Abstract

End-to-end neural dialogue generation has recently shown promising results, but because it does not employ knowledge to guide generation, it tends to produce short, generic, and meaningless responses. In this paper, we propose a neural knowledge diffusion (NKD) model to introduce knowledge into dialogue generation. The method not only matches relevant facts to the input utterance but also diffuses them to similar entities. With the help of fact matching and entity diffusion, neural dialogue generation is augmented with the ability of convergent and divergent thinking over the knowledge base. Our empirical study on a real-world dataset shows that the model generates meaningful, diverse, and natural responses for both factoid questions and knowledge-grounded chit-chats. The experimental results also show that our model significantly outperforms competitive baseline models.

Citation (APA)

Liu, S., Chen, H., Ren, Z., Feng, Y., Liu, Q., & Yin, D. (2018). Knowledge diffusion for neural dialogue generation. In ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers) (Vol. 1, pp. 1489–1498). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/p18-1138
