Unsupervised Knowledge Selection for Dialogue Generation

Abstract

Knowledge selection is an important and challenging task that provides appropriate knowledge for informative dialogue generation. However, the gold knowledge labels required for supervision are difficult to collect in practice. In this paper, we study knowledge selection for dialogue generation in the unsupervised scenario and propose a novel Distilled Distant Supervision Loss (DDSL) to supervise knowledge selection when gold knowledge labels are unavailable. Specifically, we first obtain an oracle knowledge label via distant supervision and then leverage knowledge distillation to alleviate the noisy-labeling problem of distant supervision. Furthermore, we propose a pretraining-finetuning strategy to address the mismatched knowledge selection problem: in the unsupervised setting, models tend to select mismatched knowledge for dialogue generation, which degrades the knowledge-aware decoder. Experiments on two knowledge-grounded dialogue datasets show that our approach selects knowledge more accurately in the unsupervised setting and generates more informative responses, even outperforming many strong supervised baselines.
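The abstract outlines two ingredients: a distant-supervision signal that picks an oracle knowledge candidate by its overlap with the gold response, and a distillation term that softens that noisy hard label. The sketch below illustrates one plausible reading of this recipe in PyTorch; the unigram-F1 overlap heuristic, the teacher/student naming, and the `alpha` and `temperature` parameters are assumptions made for illustration, not the paper's actual DDSL formulation.

```python
# Minimal sketch of distant-supervision labeling plus a distilled loss.
# Assumptions: unigram-F1 overlap as the distant-supervision signal and a
# KL term against a teacher's soft distribution as the distillation part.
import torch
import torch.nn.functional as F


def oracle_knowledge_label(knowledge_candidates, gold_response):
    """Distant supervision: treat the candidate with the highest unigram-F1
    overlap with the gold response as the (noisy) oracle knowledge label."""
    def unigram_f1(candidate, reference):
        cand, ref = candidate.lower().split(), reference.lower().split()
        common = len(set(cand) & set(ref))
        if common == 0:
            return 0.0
        precision, recall = common / len(cand), common / len(ref)
        return 2 * precision * recall / (precision + recall)

    scores = [unigram_f1(k, gold_response) for k in knowledge_candidates]
    return max(range(len(scores)), key=scores.__getitem__)


def distilled_distant_supervision_loss(student_logits, teacher_logits,
                                       oracle_index, alpha=0.5, temperature=2.0):
    """Hypothetical DDSL sketch: a hard cross-entropy term toward the
    distant-supervision oracle label, mixed with a soft KL term toward the
    teacher's distribution over knowledge candidates to dampen label noise."""
    hard = F.cross_entropy(student_logits.unsqueeze(0),
                           torch.tensor([oracle_index]))
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1).unsqueeze(0),
        F.softmax(teacher_logits / temperature, dim=-1).unsqueeze(0),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * hard + (1.0 - alpha) * soft


# Usage example with three knowledge candidates and random candidate scores.
candidates = ["The Eiffel Tower is in Paris .",
              "Paris hosted the 1900 Olympics .",
              "The Louvre is a museum in Paris ."]
response = "It is located in Paris , right next to the Seine ."
idx = oracle_knowledge_label(candidates, response)
loss = distilled_distant_supervision_loss(torch.randn(3), torch.randn(3), idx)
```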

Cite

APA

Chen, X., Chen, F., Meng, F., Li, P., & Zhou, J. (2021). Unsupervised Knowledge Selection for Dialogue Generation. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 1230–1244). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-acl.105
