We propose an attention-based model that treats AMR parsing as sequence-to-graph transduction. Unlike most AMR parsers, which rely on pre-trained aligners, external semantic resources, or data augmentation, our parser is aligner-free and can be trained effectively with limited amounts of labeled AMR data. It outperforms all previously reported SMATCH scores on both AMR 2.0 (76.3% F1 on LDC2017T10) and AMR 1.0 (70.2% F1 on LDC2014T12).
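The SMATCH scores cited above are F1 values over semantic triples: a predicted AMR graph is scored against the gold graph by the harmonic mean of triple-level precision and recall. A minimal sketch of that computation is below; it assumes a fixed variable alignment (the real SMATCH metric searches over alignments), and the toy triples are hypothetical, not drawn from the paper's data.

```python
def smatch_f1(gold, predicted):
    """Simplified SMATCH-style F1 over graph triples.

    Assumes variables are already aligned, so triples can be
    compared by exact match; real SMATCH optimizes the alignment.
    """
    matched = len(set(gold) & set(predicted))
    precision = matched / len(predicted)
    recall = matched / len(gold)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)


# Toy example: the two graphs agree on two of three triples.
gold = [("instance", "w", "want-01"), ("ARG0", "w", "b"), ("instance", "b", "boy")]
pred = [("instance", "w", "want-01"), ("ARG0", "w", "b"), ("instance", "b", "girl")]
print(round(smatch_f1(gold, pred), 3))  # → 0.667
```

With precision and recall both 2/3, F1 is also 2/3, illustrating how a single wrong concept node lowers the score.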
Zhang, S., Ma, X., Duh, K., & Van Durme, B. (2019). AMR parsing as sequence-to-graph transduction. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (pp. 80–93). Association for Computational Linguistics. https://doi.org/10.18653/v1/p19-1009