In this paper, we present a transition system that generalizes transition-based dependency parsing techniques to generate AMR graphs rather than tree structures. In addition to a buffer and a stack, we use a fixed-size cache, and allow the system to build arcs between any vertices that are simultaneously present in the cache. The size of the cache provides a parameter that can trade off between the complexity of the graphs that can be built and the ease of predicting actions during parsing. Our results show that a cache transition system can cover almost all AMR graphs with a small cache size, and our end-to-end system achieves competitive results in comparison with other transition-based approaches for AMR parsing.
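The parser configuration described above can be sketched as follows. This is a minimal illustration based only on the abstract's description (a buffer of input vertices, a fixed-size cache, a stack for evicted cache entries, and arcs restricted to cache vertices); all class, method, and action names are hypothetical, not the paper's actual transition set.

```python
# Hypothetical sketch of a cache transition parser state.
# Assumptions: shift moves a buffer vertex into a chosen cache slot,
# saving the evicted entry on the stack; pop restores it; arc connects
# two vertices currently in the cache.

class CacheTransitionState:
    def __init__(self, tokens, cache_size):
        self.buffer = list(tokens)        # unprocessed input vertices
        self.stack = []                   # (slot, vertex) pairs evicted from the cache
        self.cache = [None] * cache_size  # fixed-size working set of vertices
        self.arcs = []                    # (head, label, dependent) edges of the graph

    def shift(self, i):
        """Move the next buffer vertex into cache slot i,
        pushing the evicted entry onto the stack."""
        self.stack.append((i, self.cache[i]))
        self.cache[i] = self.buffer.pop(0)

    def arc(self, i, j, label):
        """Build a labeled arc between two cache vertices; since arcs may
        attach to any vertex in the cache, reentrant graph structures
        (not just trees) can be produced."""
        self.arcs.append((self.cache[i], label, self.cache[j]))

    def pop(self):
        """Restore the most recently evicted vertex to its cache slot."""
        i, v = self.stack.pop()
        self.cache[i] = v

    def finished(self):
        return not self.buffer and not self.stack
```

A larger cache lets more vertices stay reachable for arc attachment at once (covering more complex graphs), while a smaller cache shrinks the space of possible actions the model must predict, which is the trade-off the abstract describes.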
Peng, X., Gildea, D., & Satta, G. (2018). AMR parsing with cache transition systems. In 32nd AAAI Conference on Artificial Intelligence, AAAI 2018 (pp. 4897–4904). AAAI press. https://doi.org/10.1609/aaai.v32i1.11922