Abstract
Human conversations naturally evolve around related concepts and hop to distant, multi-hop concepts. This paper presents a new conversation generation model, ConceptFlow, which leverages commonsense knowledge graphs to explicitly model conversation flows. By grounding conversations in the concept space, ConceptFlow represents the potential conversation flow as traversals in the concept space along commonsense relations. The traversal is guided by graph attention over the concept graph, moving toward more meaningful directions in the concept space to generate more semantically rich and informative responses. Experiments on Reddit conversations demonstrate ConceptFlow's effectiveness over previous knowledge-aware conversation models and GPT-2 based models while using 70% fewer parameters, confirming the advantage of explicitly modeling conversation structure. All source code for this work is available at https://github.com/thunlp/ConceptFlow.
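To make the idea of attention-guided traversal concrete, the sketch below scores candidate neighbor concepts against an encoded conversation context with a simple bilinear attention and picks the most promising next hop. This is a minimal, hypothetical PyTorch illustration of the general technique described in the abstract, not the authors' released implementation; all names (score_neighbors, attn_w, etc.) are made up for this example, and the real code is in the linked repository.

```python
# Minimal sketch (assumed names, not the ConceptFlow codebase): rank
# neighboring concepts for the next hop in a concept graph using a
# bilinear attention between the conversation context and concept embeddings.
import torch
import torch.nn.functional as F

def score_neighbors(context: torch.Tensor,        # [d] encoded conversation context
                    neighbor_embs: torch.Tensor,  # [n, d] embeddings of candidate concepts
                    attn_w: torch.Tensor          # [d, d] learned attention weights
                    ) -> torch.Tensor:
    """Return an attention distribution over candidate neighbor concepts."""
    # Bilinear score: how well each neighbor concept matches the context.
    logits = neighbor_embs @ (attn_w @ context)   # [n]
    return F.softmax(logits, dim=-1)

# Toy usage: choose the highest-scoring concept as the next hop.
d, n = 8, 5
context = torch.randn(d)
neighbors = torch.randn(n, d)
attn_w = torch.randn(d, d)
probs = score_neighbors(context, neighbors, attn_w)
next_hop = int(probs.argmax())
```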
Citation
Zhang, H., Liu, Z., Xiong, C., & Liu, Z. (2020). Grounded conversation generation as guided traverses in commonsense knowledge graphs. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 2031–2043). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.184