Most previous work on neural text generation from graph-structured data relies on standard sequence-to-sequence methods. These approaches linearise the input graph to be fed to a recurrent neural network. In this paper, we propose an alternative encoder based on graph convolutional networks that directly exploits the input structure. We report results on two graph-to-sequence datasets that empirically show the benefits of explicitly encoding the input graph structure.
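To make the idea concrete, a single graph convolutional layer in the style of Kipf and Welling can be sketched as follows. This is a generic illustration of how a GCN encoder exploits graph structure, not the paper's exact architecture; the toy graph, feature dimensions, and weights are invented for the example.

```python
import numpy as np

def gcn_layer(adj, feats, weight):
    """One graph convolutional layer: aggregate neighbour features via a
    self-loop-augmented, symmetrically normalised adjacency matrix,
    then apply a linear map and a ReLU: ReLU(A_hat H W)."""
    a_hat = adj + np.eye(adj.shape[0])             # add self-loops
    deg = a_hat.sum(axis=1)                        # node degrees
    d_inv_sqrt = np.diag(deg ** -0.5)              # D^{-1/2}
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt       # symmetric normalisation
    return np.maximum(0.0, a_norm @ feats @ weight)

# Toy 3-node chain graph (edges 0-1 and 1-2), 4-dim node features,
# projected to 2-dim hidden representations.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
feats = np.ones((3, 4))
weight = np.full((4, 2), 0.5)
out = gcn_layer(adj, feats, weight)
print(out.shape)  # (3, 2): one hidden vector per node
```

Stacking several such layers lets each node's representation absorb information from increasingly distant neighbours, which is what allows the encoder to use the input graph structure directly instead of a linearisation.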
Marcheggiani, D., & Perez-Beltrachini, L. (2018). Deep graph convolutional encoders for structured data to text generation. In INLG 2018 - 11th International Natural Language Generation Conference, Proceedings of the Conference (pp. 1–9). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w18-6501