Abstract
Recent successes in deep generative modeling have led to significant advances in natural language generation (NLG). Incorporating entities into neural generation models has yielded substantial improvements by helping to infer the summary topic and to generate coherent content. To strengthen the role of entities in NLG, in this paper we model entity types in the decoding phase to generate contextual words accurately. We develop a novel NLG model that produces a target sequence from a given list of entities. Our model has a multi-step decoder that injects entity types into the process of entity mention generation. Experiments on two public news datasets demonstrate that type injection outperforms existing baselines that concatenate type embeddings.
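The contrast between the concatenation baseline and type injection can be illustrated with a toy decoding step. This is a minimal sketch under stated assumptions, not the paper's actual architecture: all parameter names, dimensions, and the sigmoid gating used for "injection" are hypothetical illustrations of the general idea (conditioning the decoder state on the entity type rather than merely appending a type embedding to the input).

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, D, N_TYPES = 10, 8, 3

# Hypothetical toy parameters (not from the paper).
word_emb = rng.normal(size=(VOCAB, D))
type_emb = rng.normal(size=(N_TYPES, D))
W_out = rng.normal(size=(D, VOCAB))          # projection for the injected variant
W_concat = rng.normal(size=(2 * D, VOCAB))   # projection for the concat baseline

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def baseline_concat_step(prev_word, ent_type):
    """Baseline: concatenate the type embedding with the word embedding
    at the decoder input, then project to the vocabulary."""
    h = np.concatenate([word_emb[prev_word], type_emb[ent_type]])
    return softmax(h @ W_concat)

def type_injected_step(prev_word, ent_type):
    """Sketch of injection: the type embedding gates the decoder state
    before the vocabulary projection, so the type modulates which
    mention words become likely."""
    h = word_emb[prev_word]
    gate = 1.0 / (1.0 + np.exp(-(h * type_emb[ent_type])))  # type-conditioned gate
    return softmax((gate * h) @ W_out)

p_baseline = baseline_concat_step(3, 1)
p_injected = type_injected_step(3, 1)
print(p_baseline.shape, p_injected.shape)  # both are distributions over the vocabulary
```

Both functions return a probability distribution over the vocabulary; the difference is *where* the type information enters the computation, which is the distinction the paper's decoder exploits.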
Citation
Dong, X., Yu, W., Zhu, C., & Jiang, M. (2021). Injecting Entity Types into Entity-Guided Text Generation. In EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 734–741). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.56