Contextualize Knowledge Bases with Transformer for End-to-end Task-Oriented Dialogue Systems

11 citations · 80 Mendeley readers

Abstract

Incorporating knowledge bases (KBs) into end-to-end task-oriented dialogue systems is challenging, since it requires properly representing each KB entity, which is associated with both its KB context and the dialogue context. Existing works represent an entity while perceiving only part of its KB context, which can yield less effective representations due to information loss and can adversely affect KB reasoning and response generation. To tackle this issue, we explore fully contextualizing entity representations by dynamically perceiving all relevant entities and the dialogue history. To achieve this, we propose a COntext-aware Memory Enhanced Transformer framework (COMET), which treats the KB as a sequence and leverages a novel Memory Mask to constrain each entity to focus only on its relevant entities and the dialogue history, avoiding distraction from irrelevant entities. Through extensive experiments, we show that our COMET framework achieves superior performance over state-of-the-art methods.
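
The Memory Mask described above is, at its core, a structured attention mask over a flattened sequence of dialogue tokens and KB entities. As a rough illustration only (not the authors' implementation), the sketch below builds such a mask in PyTorch. It assumes the KB is flattened row by row behind the dialogue history and that "relevant entities" means entities from the same KB row; the function name build_memory_mask and that row-grouping convention are illustrative assumptions.

```python
import torch

def build_memory_mask(num_dialogue_tokens: int, kb_rows: list[int]) -> torch.Tensor:
    """Build a boolean attention mask (True = may attend) for a sequence
    laid out as [dialogue tokens ...][KB entities, grouped by row ...].

    kb_rows gives the number of entities in each KB row, e.g. [3, 3]
    for two rows of three entities each. Dialogue tokens attend to the
    full sequence; each KB entity attends only to the dialogue history
    and to entities in its own row, never to entities of other rows.
    """
    total = num_dialogue_tokens + sum(kb_rows)
    mask = torch.zeros(total, total, dtype=torch.bool)

    # Dialogue tokens may attend everywhere (an assumption here; the
    # paper's constraint as stated applies to the KB entities).
    mask[:num_dialogue_tokens, :] = True

    # Each KB entity attends to the dialogue and to its own row only.
    start = num_dialogue_tokens
    for row_size in kb_rows:
        end = start + row_size
        mask[start:end, :num_dialogue_tokens] = True  # entity -> dialogue
        mask[start:end, start:end] = True             # entity -> same-row entities
        start = end
    return mask

# Usage: torch.nn.MultiheadAttention expects a bool attn_mask where True
# marks positions that may NOT be attended, so invert before passing it in.
mask = build_memory_mask(num_dialogue_tokens=5, kb_rows=[3, 3])
attn_mask = ~mask
```

Blocking cross-row attention is what the abstract means by avoiding distraction from irrelevant entities: an entity's representation is contextualized by its own record and the conversation, not by unrelated KB rows.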

Cite (APA)

Gou, Y., Lei, Y., Liu, L., Dai, Y., & Shen, C. (2021). Contextualize Knowledge Bases with Transformer for End-to-end Task-Oriented Dialogue Systems. In EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 4300–4310). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.353
