A Unified Encoder-Decoder Framework with Entity Memory


Abstract

Entities, as important carriers of real-world knowledge, play a key role in many NLP tasks. We focus on incorporating entity knowledge into an encoder-decoder framework for informative text generation. Existing approaches index, retrieve, and read external documents as evidence, but they incur a large computational overhead. In this work, we propose an Encoder-Decoder framework with an entity Memory, namely EDMem. Entity knowledge is stored in the memory as latent representations, and the memory is pre-trained on Wikipedia along with the encoder-decoder parameters. To generate entity names precisely, we design three decoding methods that constrain entity generation by linking to entities in the memory. EDMem is a unified framework that can be applied to various entity-intensive question answering and generation tasks. Extensive experimental results show that EDMem outperforms both memory-based auto-encoder models and non-memory encoder-decoder models.
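
The abstract only names the mechanism, so a minimal sketch may help picture it. The block below shows one plausible shape for a latent entity memory queried by attention from encoder or decoder hidden states. EntityMemory, its projections, and all dimensions are hypothetical illustrations under my own assumptions, not the paper's actual code; the fusion and entity-linking details in particular are guesses at the general technique rather than EDMem's specific design.

```python
import torch
import torch.nn as nn


class EntityMemory(nn.Module):
    """Minimal sketch of a latent entity memory: a table of entity
    embeddings queried by attention over transformer hidden states.
    All names and dimensions are illustrative, not the paper's code."""

    def __init__(self, num_entities: int, d_model: int, d_ent: int):
        super().__init__()
        # Latent entity representations, learned during pre-training.
        self.entity_embeddings = nn.Embedding(num_entities, d_ent)
        self.query_proj = nn.Linear(d_model, d_ent)   # hidden state -> memory query
        self.output_proj = nn.Linear(d_ent, d_model)  # retrieved knowledge -> model space

    def forward(self, hidden: torch.Tensor) -> tuple[torch.Tensor, torch.Tensor]:
        # hidden: (batch, seq_len, d_model)
        query = self.query_proj(hidden)                    # (batch, seq, d_ent)
        scores = query @ self.entity_embeddings.weight.T   # (batch, seq, num_entities)
        attn = scores.softmax(dim=-1)
        retrieved = attn @ self.entity_embeddings.weight   # (batch, seq, d_ent)
        # Fuse retrieved entity knowledge back into the hidden states.
        return hidden + self.output_proj(retrieved), scores
```

At decoding time, the same attention scores over the memory could double as entity-linking logits, which is one way generation of entity names might be constrained to entries actually present in the memory; the paper's three decoding methods are presumably variations on this idea.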

Cite (APA)

Zhang, Z., Yu, W., Zhu, C., & Jiang, M. (2022). A Unified Encoder-Decoder Framework with Entity Memory. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022 (pp. 689–705). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.emnlp-main.43
