Syntactic manipulation for generating more diverse and interesting texts


Abstract

Natural Language Generation plays an important role in the domain of dialogue systems, as it determines how users perceive the system. Recently, deep-learning-based systems have been proposed to tackle this task, as they generalize better and require less manual effort to implement for new domains. However, deep learning systems usually adopt a very homogeneous-sounding writing style that expresses little variation. In this work, we present our system for Natural Language Generation, in which we control various aspects of the surface realization in order to increase the lexical variability of the utterances, such that they sound more diverse and interesting. For this, we use a Semantically Controlled Long Short-term Memory Network (SC-LSTM) and apply its specialized cell to control various syntactic features of the generated texts. We present an in-depth human evaluation showing the effects of these surface manipulations on the perception of potential users.
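The abstract's central mechanism, the SC-LSTM's specialized cell, can be illustrated with a minimal sketch: alongside the standard LSTM gates, a "reading gate" gradually consumes a control vector (here standing in for the syntactic/semantic features), and the remaining control signal is injected into the cell state. This is an illustrative toy with random weights, not the authors' implementation; all names and sizes are assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SCLSTMCell:
    """Toy sketch of a Semantically Controlled LSTM cell.

    In addition to the usual input/forget/output gates, a reading gate r_t
    scales down a control vector d at each step, and the remaining control
    vector is added into the cell state. Weights are random for illustration.
    """

    def __init__(self, input_size, hidden_size, control_size, seed=0):
        rng = np.random.default_rng(seed)
        k = input_size + hidden_size
        # one weight matrix per gate: input, forget, output, candidate, reading
        self.Wi = rng.normal(0, 0.1, (hidden_size, k))
        self.Wf = rng.normal(0, 0.1, (hidden_size, k))
        self.Wo = rng.normal(0, 0.1, (hidden_size, k))
        self.Wg = rng.normal(0, 0.1, (hidden_size, k))
        self.Wr = rng.normal(0, 0.1, (control_size, k))
        self.Wd = rng.normal(0, 0.1, (hidden_size, control_size))

    def step(self, x, h, c, d):
        z = np.concatenate([x, h])
        i = sigmoid(self.Wi @ z)        # input gate
        f = sigmoid(self.Wf @ z)        # forget gate
        o = sigmoid(self.Wo @ z)        # output gate
        g = np.tanh(self.Wg @ z)        # candidate cell update
        r = sigmoid(self.Wr @ z)        # reading gate over the control vector
        d_new = r * d                   # consume part of the control signal
        c_new = f * c + i * g + np.tanh(self.Wd @ d_new)
        h_new = o * np.tanh(c_new)
        return h_new, c_new, d_new

# running a few steps: the control vector shrinks as its content is "realized"
cell = SCLSTMCell(input_size=4, hidden_size=8, control_size=3)
h, c, d = np.zeros(8), np.zeros(8), np.ones(3)
for _ in range(5):
    h, c, d = cell.step(np.ones(4), h, c, d)
```

Because the reading gate's sigmoid output lies strictly in (0, 1), the control vector decays monotonically, which is the mechanism the SC-LSTM uses to ensure each controlled feature is expressed once rather than repeated.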

Citation (APA)

Deriu, J., & Cieliebak, M. (2018). Syntactic manipulation for generating more diverse and interesting texts. In INLG 2018 - 11th International Natural Language Generation Conference, Proceedings of the Conference (pp. 22–34). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w18-6503
