History-Aware Hierarchical Transformer for Multi-session Open-domain Dialogue System

Abstract

With the evolution of pre-trained language models, current open-domain dialogue systems have achieved great progress in conducting one-session conversations. In contrast, Multi-Session Conversation (MSC), which consists of multiple sessions over a long term with the same user, is under-investigated. In this paper, we propose History-Aware Hierarchical Transformer (HAHT) for multi-session open-domain dialogue. HAHT maintains a long-term memory of history conversations and utilizes history information to understand current conversation context and generate well-informed and context-relevant responses. Specifically, HAHT first encodes history conversation sessions hierarchically into a history memory. Then, HAHT leverages historical information to facilitate the understanding of the current conversation context by encoding the history memory together with the current context with attention-based mechanisms. Finally, to explicitly utilize historical information, HAHT uses a history-aware response generator that switches between a generic vocabulary and a history-aware vocabulary. Experimental results on a large-scale MSC dataset suggest that the proposed HAHT model consistently outperforms baseline models. Human evaluation results support that HAHT generates more human-like, context-relevant and history-relevant responses than baseline models.
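The final step of the abstract — a generator that switches between a generic vocabulary and a history-aware vocabulary — resembles a gated mixture of a generic softmax distribution and a copy distribution over tokens from past sessions. The sketch below is not the paper's implementation; it is a minimal pure-Python illustration of that switching idea, with the `gate` value, the toy vocabulary, and all function names being illustrative assumptions:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def history_aware_distribution(generic_logits, history_scores,
                               history_tokens, vocab, gate):
    """Mix a generic-vocabulary distribution with a copy distribution
    over history tokens (hypothetical sketch, not HAHT's exact method).

    generic_logits: one score per vocabulary entry.
    history_scores: one attention score per history token position.
    history_tokens: tokens from past sessions (each must be in vocab).
    gate: in [0, 1], the probability mass given to the generic vocabulary.
    """
    p_gen = softmax(generic_logits)          # distribution over full vocab
    p_copy_local = softmax(history_scores)   # distribution over history positions
    # Scatter the copy probabilities onto their vocabulary ids, so tokens
    # that appeared in history get extra mass in the final distribution.
    p_copy = [0.0] * len(vocab)
    for pos, tok in enumerate(history_tokens):
        p_copy[vocab[tok]] += p_copy_local[pos]
    return [gate * g + (1.0 - gate) * c for g, c in zip(p_gen, p_copy)]
```

With a uniform generic distribution, a token mentioned in a previous session (e.g. "pizza") receives more probability than an unseen token, which is the behavior the history-aware vocabulary is meant to encourage.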

Citation (APA)

Zhang, T., Liu, Y., Li, B., Zeng, Z., Wang, P., You, Y., … Cui, L. (2022). History-Aware Hierarchical Transformer for Multi-session Open-domain Dialogue System. In Findings of the Association for Computational Linguistics: EMNLP 2022 (pp. 3395–3407). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.findings-emnlp.10
