Task-oriented dialogue generation is challenging because the underlying knowledge is often dynamic and effectively incorporating it into the learning process is hard. It is particularly challenging to generate responses that are both human-like and informative in this setting. Recent research has primarily focused on various knowledge distillation methods, in which the underlying relationships between the facts in a knowledge base are not effectively captured. In this paper, we go one step further and demonstrate how the structural information of a knowledge graph can improve the system's inference capabilities. Specifically, we propose DialoKG, a novel task-oriented dialogue system that effectively incorporates knowledge into a language model. Our proposed system views relational knowledge as a knowledge graph and introduces (1) a structure-aware knowledge embedding technique and (2) a knowledge graph-weighted attention masking strategy to help the system select relevant information during dialogue generation. An empirical evaluation demonstrates the effectiveness of DialoKG over state-of-the-art methods on several standard benchmark datasets.
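To illustrate the general idea behind graph-weighted attention masking, the sketch below biases attention over knowledge-triple embeddings with relevance weights derived from a knowledge graph. This is a minimal, hypothetical illustration, not the paper's actual implementation: the function name, the embedding dimensions, and the example weights are all assumptions, and the paper's method of deriving weights from graph structure is not reproduced here.

```python
import numpy as np

def graph_weighted_attention(query, triple_embs, graph_weights):
    """Attention over knowledge triples, soft-masked by graph weights.

    query:         (d,) decoder query vector
    triple_embs:   (n, d) embeddings of n knowledge-graph triples
    graph_weights: (n,) relevance scores in [0, 1] (hypothetical values;
                   DialoKG derives such weights from the graph structure)
    """
    # Scaled dot-product scores between the query and each triple
    scores = triple_embs @ query / np.sqrt(query.shape[0])
    # Apply the graph weights as an additive mask in log space,
    # so a weight near 0 effectively masks that triple out
    scores = scores + np.log(graph_weights + 1e-9)
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()

rng = np.random.default_rng(0)
query = rng.normal(size=8)
triples = rng.normal(size=(4, 8))          # 4 triples embedded in 8 dims
weights = np.array([1.0, 0.1, 0.9, 0.0])   # assumed graph-derived relevance
attn = graph_weighted_attention(query, triples, weights)
print(attn)
```

The masked triple (weight 0.0) receives near-zero attention, while the remaining probability mass is distributed among the graph-relevant triples.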
CITATION STYLE
Al Hasan Rony, M. R., Usbeck, R., & Lehmann, J. (2022). DialoKG: Knowledge-Structure Aware Task-Oriented Dialogue Generation. In Findings of the Association for Computational Linguistics: NAACL 2022 (pp. 2557–2571). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.findings-naacl.195