Abstract
The dominant paradigm for semantic parsing in recent years is to formulate parsing as a sequence-to-sequence task, generating predictions with auto-regressive sequence decoders. In this work, we explore an alternative paradigm. We formulate semantic parsing as a dependency parsing task, applying graph-based decoding techniques developed for syntactic parsing. We compare various decoding techniques given the same pre-trained Transformer encoder on the TOP dataset, including settings where training data is limited or contains only partially-annotated examples. We find that our graph-based approach is competitive with sequence decoders on the standard setting, and offers significant improvements in data efficiency and settings where partially-annotated data is available.
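The graph-based decoding referred to here casts prediction as finding a maximum spanning arborescence over per-arc scores, in the style of MST dependency parsing. Below is a minimal sketch of Chu-Liu/Edmonds decoding over a toy score matrix; the function names and scores are illustrative, not the paper's implementation, and it assumes every non-root token has at least one finite-score candidate head.

```python
NEG = float("-inf")

def find_cycle(head):
    """Return a list of nodes forming a cycle under `head`, or None."""
    color = [0] * len(head)  # 0 = unvisited, 1 = on current path, 2 = done
    color[0] = 2             # node 0 is ROOT and has no head
    for start in range(1, len(head)):
        if color[start]:
            continue
        path, v = [], start
        while color[v] == 0:
            color[v] = 1
            path.append(v)
            v = head[v]
        if color[v] == 1:                  # walked back into our own path
            return path[path.index(v):]
        for u in path:
            color[u] = 2
    return None

def mst(scores):
    """Chu-Liu/Edmonds: maximum spanning arborescence rooted at node 0.
    scores[h][d] is the score of the arc head h -> dependent d.
    Returns head[d] for every d >= 1 (head[0] is a dummy)."""
    n = len(scores)
    head = [0] * n
    for d in range(1, n):  # greedy step: best-scoring head per token
        head[d] = max((h for h in range(n) if h != d),
                      key=lambda h: scores[h][d])
    cyc = find_cycle(head)
    if cyc is None:
        return head
    C = set(cyc)
    # Contract the cycle into a single node and recurse.
    old_nodes = [v for v in range(n) if v not in C]
    old2new = {v: i for i, v in enumerate(old_nodes)}
    c = len(old_nodes)                     # index of the contracted node
    new_scores = [[NEG] * (c + 1) for _ in range(c + 1)]
    in_arc = {}                            # contracted arc -> original arc
    for h in range(n):
        for d in range(n):
            if h == d or (h in C and d in C):
                continue
            if h in C:                     # arc leaving the cycle
                nh, nd, s = c, old2new[d], scores[h][d]
            elif d in C:                   # arc entering the cycle: swap cost
                nh, nd, s = old2new[h], c, scores[h][d] - scores[head[d]][d]
            else:                          # arc untouched by the contraction
                nh, nd, s = old2new[h], old2new[d], scores[h][d]
            if s > new_scores[nh][nd]:
                new_scores[nh][nd] = s
                in_arc[(nh, nd)] = (h, d)
    new_head = mst(new_scores)
    # Expand: copy arcs back, breaking the cycle at its chosen entry point.
    result = [0] * n
    for d_new in range(1, c + 1):
        oh, od = in_arc[(new_head[d_new], d_new)]
        if d_new == c:                     # the arc that enters the cycle
            for v in C:
                result[v] = head[v]        # keep cycle arcs ...
            result[od] = oh                # ... except the one replaced
        else:
            result[od] = oh
    return result
```

For example, with three nodes (0 = ROOT) where the greedy step would pick the cycle 1 <-> 2, the contraction correctly breaks the cycle at token 1, yielding the tree ROOT -> 1 -> 2.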
Cole, J. R., Jiang, N., Pasupat, P., He, L., & Shaw, P. (2021). Graph-Based Decoding for Task Oriented Semantic Parsing. In Findings of the Association for Computational Linguistics, Findings of ACL: EMNLP 2021 (pp. 4057–4065). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-emnlp.341