Semi-supervised neural text generation by joint learning of natural language generation and natural language understanding models


Abstract

In Natural Language Generation (NLG), End-to-End (E2E) systems trained through deep learning have recently gained strong interest. Such deep models need a large amount of carefully annotated data to reach satisfactory performance. However, acquiring such datasets for every new NLG application is a tedious and time-consuming task. In this paper, we propose a semi-supervised deep learning scheme that can learn from non-annotated data and from annotated data when available. It uses NLG and Natural Language Understanding (NLU) sequence-to-sequence models that are learned jointly to compensate for the lack of annotation. Experiments on two benchmark datasets show that, with a limited amount of annotated data, the method achieves very competitive results without using any pre-processing or re-scoring tricks. These findings open the way to the exploitation of non-annotated datasets, which is currently the main bottleneck in extending E2E NLG systems to new applications.
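
To make the joint scheme concrete, below is a minimal PyTorch sketch, not the authors' implementation: the tiny GRU encoder-decoders, the shared toy vocabulary, and the detached greedy pseudo-labelling used for the unsupervised cycle are all illustrative assumptions. The abstract only states that the NLG and NLU models are sequence-to-sequence models learned jointly; the detached text -> MR -> text shortcut shown here is one simple way to exploit non-annotated text alongside the supervised losses.

import torch
import torch.nn as nn

VOCAB, HID, PAD, BOS = 1000, 128, 0, 1  # toy sizes; shared token space (assumption)

class Seq2Seq(nn.Module):
    """A tiny GRU encoder-decoder standing in for either the NLG or NLU model."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, HID, padding_idx=PAD)
        self.enc = nn.GRU(HID, HID, batch_first=True)
        self.dec = nn.GRU(HID, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, src, tgt_in):
        _, h = self.enc(self.emb(src))          # encode the source sequence
        dec, _ = self.dec(self.emb(tgt_in), h)  # teacher-forced decoding
        return self.out(dec)                    # (batch, len, vocab) logits

def greedy_decode(model, src, max_len=8):
    """Greedy autoregressive decoding, used here only to produce pseudo-labels."""
    _, h = model.enc(model.emb(src))
    tok = torch.full((src.size(0), 1), BOS, dtype=torch.long)
    outputs = [tok]
    for _ in range(max_len):
        dec, h = model.dec(model.emb(tok), h)
        tok = model.out(dec).argmax(-1)
        outputs.append(tok)
    return torch.cat(outputs, dim=1)

nlg, nlu = Seq2Seq(), Seq2Seq()  # NLG: MR -> text; NLU: text -> MR
opt = torch.optim.Adam(list(nlg.parameters()) + list(nlu.parameters()), lr=1e-3)
xent = nn.CrossEntropyLoss(ignore_index=PAD)

def joint_step(mr, text, unlabeled_text):
    """One update mixing supervised losses on annotated (MR, text) pairs with
    an unsupervised text -> MR -> text reconstruction loss on raw text."""
    # Supervised: both models learn from the annotated pairs, in both directions.
    sup = (xent(nlg(mr, text[:, :-1]).transpose(1, 2), text[:, 1:]) +
           xent(nlu(text, mr[:, :-1]).transpose(1, 2), mr[:, 1:]))

    # Unsupervised cycle: NLU guesses a pseudo-MR for unannotated text
    # (detached greedy decoding is a simplification; the paper learns the
    # two models jointly), then NLG must reconstruct the original text.
    with torch.no_grad():
        pseudo_mr = greedy_decode(nlu, unlabeled_text)
    unsup = xent(nlg(pseudo_mr, unlabeled_text[:, :-1]).transpose(1, 2),
                 unlabeled_text[:, 1:])

    loss = sup + unsup
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# Toy usage: random token ids stand in for real meaning representations and text.
mr = torch.randint(2, VOCAB, (4, 8))
text = torch.randint(2, VOCAB, (4, 12))
print(joint_step(mr, text, torch.randint(2, VOCAB, (16, 12))))

The key design point the sketch illustrates is that the same annotated pairs supervise both directions at once, while non-annotated text contributes through a reconstruction objective, which is what lets the scheme benefit from data without annotation.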

Citation (APA)

Qader, R., Portet, F., & Labbé, C. (2019). Semi-supervised neural text generation by joint learning of natural language generation and natural language understanding models. In INLG 2019 - 12th International Conference on Natural Language Generation, Proceedings of the Conference (pp. 552–562). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/W19-8669
