Stylized Knowledge-Grounded Dialogue Generation via Disentangled Template Rewriting


Abstract

Current Knowledge-Grounded Dialogue Generation (KDG) models specialize in producing rational and factual responses. However, to establish long-term relationships with users, a KDG model also needs the capability to generate responses in a desired style or sentiment. We therefore study a new problem: Stylized Knowledge-Grounded Dialogue Generation (SKDG). It presents two challenges: (1) how to train an SKDG model when no (context, knowledge, stylized response) triples are available; (2) how to cohere with the context and preserve the grounding knowledge when generating a stylized response. In this paper, we propose a novel disentangled template rewriting (DTR) method, which generates responses by combining disentangled style templates (from a monolingual stylized corpus) with content templates (from the KDG corpus). The entire framework is end-to-end differentiable and learned without supervision. Extensive experiments on two benchmarks show that DTR achieves a significant improvement on all evaluation metrics over previous state-of-the-art stylized dialogue generation methods. Moreover, DTR achieves performance comparable to state-of-the-art KDG methods in the standard KDG evaluation setting.
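To make the idea concrete, here is a minimal, hypothetical sketch of the template-rewriting pipeline the abstract describes: a factual KDG response is disentangled into a style-agnostic content template, whose style slots are then filled with fragments drawn from a monolingual stylized corpus. Every name here (STYLE_LEXICON, extract_content_template, rewrite_with_style) and the toy word list are illustrative assumptions, not the authors' implementation; the real DTR learns the style/content disentanglement end-to-end rather than relying on a fixed lexicon.

```python
from typing import List

# Toy stand-in for a learned style detector (hypothetical; DTR learns this).
STYLE_LEXICON = {"honestly", "totally", "awesome"}

def extract_content_template(response_tokens: List[str]) -> List[str]:
    """Replace style-bearing tokens with a slot, keeping the knowledge content."""
    return [
        "[STYLE]" if tok.lower() in STYLE_LEXICON else tok
        for tok in response_tokens
    ]

def rewrite_with_style(template: List[str], style_fragments: List[str]) -> str:
    """Fill each [STYLE] slot with a fragment mined from the stylized corpus."""
    frags = iter(style_fragments)
    filled = [next(frags, "") if tok == "[STYLE]" else tok for tok in template]
    return " ".join(tok for tok in filled if tok)

# Usage: a factual KDG response is disentangled into a content template,
# then re-stylized without disturbing the grounded facts.
kdg_response = "The Eiffel Tower is honestly 330 meters tall".split()
template = extract_content_template(kdg_response)  # "honestly" becomes [STYLE]
print(rewrite_with_style(template, ["remarkably"]))
# -> The Eiffel Tower is remarkably 330 meters tall
```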

Citation (APA)

Sun, Q., Xu, C., Hu, H., Wang, Y., Miao, J., Geng, X., … Jiang, D. (2022). Stylized Knowledge-Grounded Dialogue Generation via Disentangled Template Rewriting. In NAACL 2022 - 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Conference (pp. 3304–3318). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.naacl-main.241
