Approximation of response knowledge retrieval in knowledge-grounded dialogue generation

Abstract

This paper is concerned with improving dialogue generation models through the injection of knowledge, i.e., content relevant to the post that can increase the quality of responses. Prior work extends the training of generative models by incorporating statistical properties of posts, responses, and related knowledge, without explicitly assessing the quality of that knowledge. In our work, we demonstrate the importance of knowledge relevance and adopt a two-phase approach: we first apply a novel method, Transformer & Post based Posterior Approximation (TPPA), to select knowledge, and then use the Transformer with Expanded Decoder (TED) model to generate responses from both the post and the selected knowledge. The TPPA method processes posts, post-related knowledge, and response-related knowledge at both the word and sentence levels. Our experiments with the TED generative model demonstrate the effectiveness of TPPA, which outperforms a set of strong baseline models. The TPPA method is extensible and supports further optimization of knowledge retrieval and injection.
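To make the select-then-generate structure of the abstract concrete, the sketch below illustrates a two-phase pipeline in Python: phase 1 scores candidate knowledge sentences against the post and keeps the best one, phase 2 conditions generation on the post plus the selected knowledge. This is a minimal illustration, not the paper's implementation: the bag-of-words cosine scorer stands in for the Transformer-based posterior approximation, and generate_response is a placeholder for a trained generator such as TED; all function names are hypothetical.

    from collections import Counter
    import math

    def score_relevance(post: str, knowledge: str) -> float:
        """Toy relevance score: cosine similarity over bag-of-words counts.
        Stands in for a learned scorer like TPPA's posterior approximation."""
        a, b = Counter(post.lower().split()), Counter(knowledge.lower().split())
        dot = sum(a[w] * b[w] for w in a)
        norm = (math.sqrt(sum(v * v for v in a.values()))
                * math.sqrt(sum(v * v for v in b.values())))
        return dot / norm if norm else 0.0

    def select_knowledge(post: str, candidates: list[str]) -> str:
        """Phase 1: pick the knowledge sentence most relevant to the post."""
        return max(candidates, key=lambda k: score_relevance(post, k))

    def generate_response(post: str, knowledge: str) -> str:
        """Phase 2: placeholder for a generator (e.g., TED) that would decode
        a response conditioned on both the post and the knowledge."""
        return f"[response conditioned on post={post!r} and knowledge={knowledge!r}]"

    if __name__ == "__main__":
        post = "What is the tallest mountain on Earth?"
        candidates = [
            "Mount Everest is Earth's highest mountain above sea level.",
            "The Pacific Ocean is the largest ocean on the planet.",
        ]
        selected = select_knowledge(post, candidates)
        print(generate_response(post, selected))

The key design point the sketch captures is that knowledge quality is assessed explicitly before generation, rather than being absorbed implicitly during generator training.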

Cite

APA: Zheng, W., Milic-Frayling, N., & Zhou, K. (2020). Approximation of response knowledge retrieval in knowledge-grounded dialogue generation. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 3581–3591). Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.findings-emnlp.321
