Coupling Context Modeling with Zero Pronoun Recovering for Document-Level Natural Language Generation

Citations: 6 | Mendeley readers: 49

Abstract

Natural language generation (NLG) tasks on pro-drop languages are known to suffer from zero pronoun (ZP) problems, which remain challenging due to the scarcity of ZP-annotated NLG corpora. To address this, we propose a highly adaptive two-stage approach that couples context modeling with ZP recovery to mitigate the ZP problem in NLG tasks. Notably, we frame the recovery process in a task-supervised fashion, in which the ability to recover ZP representations is learned during NLG task training, so our method requires no NLG corpora annotated with ZPs. To further strengthen the system, we train an adversarial bot that adjusts our model's outputs, alleviating the error propagation caused by mis-recovered ZPs. Experiments on three document-level NLG tasks, i.e., machine translation, question answering, and summarization, show that our approach yields substantial improvements, particularly on pronoun translation.

Citation (APA)
Tan, X., Zhang, L., & Zhou, G. (2021). Coupling Context Modeling with Zero Pronoun Recovering for Document-Level Natural Language Generation. In EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 2530–2540). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.197
