Deep graph convolutional encoders for structured data to text generation


Abstract

Most previous work on neural text generation from graph-structured data relies on standard sequence-to-sequence methods. These approaches linearise the input graph so that it can be fed to a recurrent neural network. In this paper, we propose an alternative encoder based on graph convolutional networks that directly exploits the input structure. We report results on two graph-to-sequence datasets that empirically show the benefits of explicitly encoding the input graph structure.
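As a rough illustration of the kind of encoder the abstract describes, the sketch below implements a single graph convolutional layer in plain NumPy: each node updates its representation by averaging the features of its neighbours (and of itself, via a self-loop) and applying a learned linear projection followed by a ReLU. This is a minimal, hypothetical example, not the authors' implementation; the function name gcn_layer, the toy adjacency matrix, and the random weights are assumptions, and the paper's model stacks several such layers inside a full encoder-decoder architecture.

```python
# Minimal sketch of one graph convolutional layer (illustrative only).
import numpy as np

def gcn_layer(node_feats, adj, weight):
    """One GCN layer: H' = ReLU(D^{-1} (A + I) H W).

    node_feats: (num_nodes, in_dim)  input node representations
    adj:        (num_nodes, num_nodes) binary adjacency matrix
    weight:     (in_dim, out_dim)    projection matrix (random here)
    """
    adj_hat = adj + np.eye(adj.shape[0])                 # add self-loops
    deg_inv = 1.0 / adj_hat.sum(axis=1, keepdims=True)   # row-normalise
    hidden = deg_inv * (adj_hat @ node_feats)            # average neighbour features
    return np.maximum(0.0, hidden @ weight)              # project + ReLU

# Toy example: a 3-node graph (e.g. subject - predicate - object)
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
node_feats = np.random.randn(3, 4)    # 4-dimensional input embeddings
weight = np.random.randn(4, 8)        # project to 8 dimensions
encoded = gcn_layer(node_feats, adj, weight)
print(encoded.shape)                  # (3, 8): one updated vector per node
```

In contrast to a sequence-to-sequence baseline, which would first linearise the graph into a token sequence, the layer above operates on the adjacency matrix directly, so the structural neighbourhood of each node is preserved in the encoding.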

Cite

APA

Marcheggiani, D., & Perez-Beltrachini, L. (2018). Deep graph convolutional encoders for structured data to text generation. In INLG 2018 - 11th International Natural Language Generation Conference, Proceedings of the Conference (pp. 1–9). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w18-6501
