Long Time No See! Open-Domain Conversation with Long-Term Persona Memory


Abstract

Most open-domain dialogue models perform poorly in long-term human-bot conversations, likely because they lack the ability to understand and memorize information from long-term dialogue history. To address this issue, we present a novel task, Long-term Memory Conversation (LeMon), and build a new dialogue dataset, DuLeMon, together with a dialogue generation framework, PLATO-LTM, equipped with a Long-Term Memory (LTM) mechanism. The LTM mechanism enables the system to accurately extract and continuously update long-term persona memory without requiring multi-session dialogue datasets for model training. To our knowledge, this is the first attempt at real-time dynamic management of the persona information of both parties, the user and the bot. Results on DuLeMon show that PLATO-LTM significantly outperforms baselines in long-term dialogue consistency, leading to better dialogue engagingness.
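The abstract describes an LTM module that writes newly extracted persona sentences for both parties and reads the most relevant ones back at generation time. The sketch below illustrates one plausible read/write interface for such a memory, assuming a simple token-overlap matcher in place of the paper's trained persona extractor and dense retriever; the class name PersonaMemory, the update threshold, and all method names are illustrative assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass, field

def jaccard(a: str, b: str) -> float:
    """Token-overlap similarity; a crude stand-in for a learned persona matcher."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if (ta | tb) else 0.0

@dataclass
class PersonaMemory:
    """Separate long-term persona stores for the user and the bot (assumed design)."""
    threshold: float = 0.5  # hypothetical similarity threshold for treating a fact as an update
    store: dict = field(default_factory=lambda: {"user": [], "bot": []})

    def write(self, speaker: str, persona: str) -> None:
        """Insert a new persona sentence, or overwrite the closest conflicting one."""
        entries = self.store[speaker]
        for i, old in enumerate(entries):
            if jaccard(old, persona) >= self.threshold:
                entries[i] = persona  # continuous update: the newer statement wins
                return
        entries.append(persona)  # novel persona fact: append to long-term memory

    def read(self, query: str, top_k: int = 3) -> list:
        """Retrieve the persona sentences (from both parties) most relevant to the current turn."""
        candidates = [(s, p) for s in ("user", "bot") for p in self.store[s]]
        candidates.sort(key=lambda sp: jaccard(sp[1], query), reverse=True)
        return candidates[:top_k]

# Usage: memory persists across sessions and stays consistent under updates.
mem = PersonaMemory()
mem.write("user", "i have a golden retriever named toby")
mem.write("user", "i have a golden retriever named max")  # overwrites the earlier, conflicting fact
print(mem.read("how is your dog doing"))
```

The retrieved persona sentences would then be fed to the response generator as extra context, which is how a memory of this shape supports long-term consistency.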

Citation (APA)

Xu, X., Gou, Z., Wu, W., Niu, Z. Y., Wu, H., Wang, H., & Wang, S. (2022). Long Time No See! Open-Domain Conversation with Long-Term Persona Memory. In Findings of the Association for Computational Linguistics: ACL 2022 (pp. 2639–2650). Association for Computational Linguistics. https://doi.org/10.18653/v1/2022.findings-acl.207
