Line graph enhanced AMR-to-text generation with mix-order graph attention networks


Abstract

Efficient structure encoding for graphs with labeled edges is an important yet challenging problem in many graph-based models. This work focuses on AMR-to-text generation: a graph-to-sequence task aiming to recover natural language from Abstract Meaning Representations (AMR). Existing graph-to-sequence approaches generally use graph neural networks as their encoders, which have two limitations: 1) the message propagation process in AMR graphs is guided only by first-order adjacency information; 2) the relationships between labeled edges are not fully considered. In this work, we propose a novel graph encoding framework that can effectively exploit edge relations. We also adopt graph attention networks with higher-order neighborhood information to encode the rich structure in AMR graphs. Experimental results show that our approach achieves new state-of-the-art performance on English AMR benchmark datasets. Ablation analyses further demonstrate that both edge relations and higher-order information are beneficial to graph-to-sequence modeling.
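The two ideas in the abstract can be illustrated concretely. Higher-order neighborhoods are commonly obtained by binarizing powers of the adjacency matrix, and a line graph turns each labeled edge into a node so that edge relations become ordinary node adjacency. The sketch below is illustrative only and is not the paper's actual implementation; all function names are our own.

```python
import numpy as np

def k_order_neighbors(adj, k):
    # Nodes reachable within k hops: binarize (A + I)^k.
    n = adj.shape[0]
    reach = np.linalg.matrix_power(adj + np.eye(n, dtype=adj.dtype), k)
    return (reach > 0).astype(int)

def line_graph(edges):
    # Line graph: each edge of the original graph becomes a node;
    # two edge-nodes are adjacent iff the original edges share an endpoint.
    lg = {e: set() for e in edges}
    for i, e1 in enumerate(edges):
        for e2 in edges[i + 1:]:
            if set(e1) & set(e2):
                lg[e1].add(e2)
                lg[e2].add(e1)
    return lg

# Toy directed chain 0 -> 1 -> 2: node 2 is invisible to node 0 at
# first order but becomes a neighbor at second order.
adj = np.array([[0, 1, 0], [0, 0, 1], [0, 0, 0]])
assert k_order_neighbors(adj, 1)[0, 2] == 0
assert k_order_neighbors(adj, 2)[0, 2] == 1
```

A mix-order attention encoder would run attention over each of these k-hop masks and combine the results, while the line graph supplies a structure over which edge labels themselves can exchange messages.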

Citation (APA)

Zhao, Y., Chen, L., Chen, Z., Cao, R., Zhu, S., & Yu, K. (2020). Line graph enhanced AMR-to-text generation with mix-order graph attention networks. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 732–741). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.67
