Enhancing Text Generation via Parse Tree Embedding

Abstract

Natural language generation (NLG) is a core component of machine translation, dialogue systems, speech recognition, summarization, and related tasks. Existing text generation methods tend to be based on recurrent neural language models (NLMs), which generate sentences from an encoding vector; however, most of these models lack an explicit structured representation for text generation. In this work, we introduce a new generative model for NLG, called Tree-VAE. It first samples a sentence from the training corpus and then generates a new sentence conditioned on the corresponding parse tree embedding vector. A Tree-LSTM is used in combination with the Stanford Parser to extract sentence-structure information, which is then used to train a conditional variational autoencoder generator based on the embeddings of sentence patterns. The proposed model is extensively evaluated on three different datasets. The experimental results show that the proposed model generates substantially more diverse and coherent text than existing baseline methods.
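The Tree-LSTM encoder at the heart of this pipeline can be sketched as a child-sum Tree-LSTM cell (in the style of Tai et al., 2015) applied bottom-up over a parse tree, producing a fixed-size embedding at the root. The sketch below is illustrative only: the NumPy implementation, parameter names, and dimensions are assumptions for exposition, not the authors' code, and the conditional VAE generator that consumes the root embedding is omitted.

```python
import numpy as np

class ChildSumTreeLSTMCell:
    """Minimal child-sum Tree-LSTM cell; hypothetical illustration,
    not the Tree-VAE paper's implementation."""

    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        d, h = input_dim, hidden_dim
        self.hidden_dim = h
        # One weight matrix, recurrence matrix, and bias per gate:
        # input (i), forget (f), output (o), candidate update (u).
        self.W = {g: rng.normal(0, 0.1, (h, d)) for g in "ifou"}
        self.U = {g: rng.normal(0, 0.1, (h, h)) for g in "ifou"}
        self.b = {g: np.zeros(h) for g in "ifou"}

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def forward(self, x, children):
        """x: input vector for this tree node; children: list of (h, c)
        hidden/cell pairs already computed for the node's children."""
        # Child hidden states are summed (hence "child-sum").
        h_sum = sum((h for h, _ in children), np.zeros(self.hidden_dim))

        def affine(g, h_vec):
            return self.W[g] @ x + self.U[g] @ h_vec + self.b[g]

        i = self._sigmoid(affine("i", h_sum))
        o = self._sigmoid(affine("o", h_sum))
        u = np.tanh(affine("u", h_sum))
        c = i * u
        # One forget gate per child, computed from that child's own state.
        for h_k, c_k in children:
            f_k = self._sigmoid(affine("f", h_k))
            c = c + f_k * c_k
        h = o * np.tanh(c)
        return h, c

# Encode a tiny parse tree bottom-up: two leaves feeding one root node.
cell = ChildSumTreeLSTMCell(input_dim=4, hidden_dim=3)
leaf1 = cell.forward(np.ones(4), [])
leaf2 = cell.forward(-np.ones(4), [])
root_h, root_c = cell.forward(np.zeros(4), [leaf1, leaf2])
print(root_h.shape)  # the root hidden state is the tree embedding
```

In the full model as the abstract describes it, the root embedding from a Stanford-Parser parse tree would condition the generator, so a new sentence inherits the sampled sentence's syntactic pattern.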


Citation (APA)

Duan, D., Zhang, Q., Han, Z., & Xiong, H. (2022). Enhancing Text Generation via Parse Tree Embedding. Computational Intelligence and Neuroscience, 2022. https://doi.org/10.1155/2022/4096383
