Abstract
Semantic parsing is one of the key components of natural language understanding systems. A successful parse transforms an input utterance into an action that is easily understood by the system. Many algorithms have been proposed to solve this problem, from conventional rule-based or statistical slot-filling systems to shift-reduce-based neural parsers. For complex parsing tasks, the state-of-the-art method is based on autoregressive sequence-to-sequence models that generate the parse directly. This model is slow at inference time, generating parses in O(n) decoding steps (where n is the length of the target sequence). In addition, we demonstrate that this method performs poorly in zero-shot cross-lingual transfer learning settings. In this paper, we propose a non-autoregressive parser based on the insertion transformer to overcome these two issues. Our approach 1) speeds up decoding by 3x while outperforming the autoregressive model and 2) significantly improves cross-lingual transfer in the low-resource setting, by 37% compared to the autoregressive baseline. We test our approach on three well-known monolingual datasets: ATIS, SNIPS and TOP. For cross-lingual semantic parsing, we use the MultiATIS++ and the multilingual TOP datasets.
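The decoding-speed contrast can be illustrated with a back-of-the-envelope step count. The sketch below (an illustration, not the paper's implementation) compares the O(n) steps of left-to-right autoregressive decoding with the roughly log2(n) parallel rounds achievable by insertion-based decoding under a balanced binary insertion order, where every slot in the partial sequence can receive a new token in the same round:

```python
import math

def autoregressive_steps(target_len: int) -> int:
    # One token is emitted per decoding step, so an n-token
    # parse needs n sequential steps.
    return target_len

def insertion_steps(target_len: int) -> int:
    # With a balanced binary insertion order, every gap in the
    # partial sequence can be filled in parallel each round, so
    # the sequence roughly doubles per round:
    # ~ceil(log2(n + 1)) rounds to reach n tokens.
    return math.ceil(math.log2(target_len + 1))

for n in (7, 31, 100):
    print(f"n={n}: autoregressive={autoregressive_steps(n)} "
          f"insertion={insertion_steps(n)}")
```

In practice the measured speedup (3x in the abstract) is smaller than the asymptotic gap, since each insertion round is a full transformer forward pass.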
Zhu, Q., Khan, H., Soltan, S., Rawls, S., & Hamza, W. (2020). Don’t Parse, Insert: Multilingual Semantic Parsing with Insertion Based Decoding. In CoNLL 2020 - 24th Conference on Computational Natural Language Learning, Proceedings of the Conference (pp. 496–506). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.conll-1.40