EARL: Informative Knowledge-Grounded Conversation Generation with Entity-Agnostic Representation Learning

Abstract

Generating informative and appropriate responses is challenging but important for building human-like dialogue systems. Although various knowledge-grounded conversation models have been proposed, these models have limitations in utilizing knowledge that occurs infrequently in the training data, not to mention integrating unseen knowledge into conversation generation. In this paper, we propose an Entity-Agnostic Representation Learning (EARL) method to introduce knowledge graphs into informative conversation generation. Unlike traditional approaches that parameterize a specific representation for each entity, EARL utilizes the context of conversations and the relational structure of knowledge graphs to learn category representations for entities, which generalize to incorporating unseen entities in knowledge graphs into conversation generation. Automatic and manual evaluations demonstrate that our model generates more informative, coherent, and natural responses than baseline models.
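The core idea in the abstract can be illustrated with a toy sketch (this is an assumption-laden illustration, not the authors' implementation): a conventional model keeps one embedding per entity and therefore cannot represent entities unseen in training, whereas an entity-agnostic model keys embeddings by the entity's category in the knowledge graph, so any entity whose category is known can still be embedded.

```python
import numpy as np

# Hypothetical sketch, NOT the paper's code: contrast a per-entity
# embedding table with an entity-agnostic one that keys vectors by
# entity *category*, so unseen entities still resolve to a learned
# representation via the knowledge graph's type information.

rng = np.random.default_rng(0)
DIM = 4

# Per-entity table: only entities observed during training get vectors.
per_entity = {"Paris": rng.normal(size=DIM), "Einstein": rng.normal(size=DIM)}

# Entity-agnostic table: one learned vector per knowledge-graph category.
per_category = {"City": rng.normal(size=DIM), "Person": rng.normal(size=DIM)}

# Category lookup from the knowledge graph (covers unseen entities too).
entity_to_category = {"Paris": "City", "Einstein": "Person", "Lyon": "City"}

def embed(entity: str) -> np.ndarray:
    """Return a vector for `entity`, falling back to its category."""
    if entity in per_entity:
        return per_entity[entity]
    category = entity_to_category.get(entity)
    if category is not None:
        return per_category[category]          # unseen entity, known category
    raise KeyError(f"unknown entity: {entity}")

# "Lyon" never appeared in training, yet it still gets a representation.
print(embed("Lyon").shape)  # → (4,)
```

In the paper itself the category representations are learned jointly from conversational context and the graph's relational structure; the fixed lookup tables above merely stand in for those learned components.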

Citation (APA)
Zhou, H., Huang, M., Liu, Y., Chen, W., & Zhu, X. (2021). EARL: Informative Knowledge-Grounded Conversation Generation with Entity-Agnostic Representation Learning. In EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 2383–2395). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.184
