Bootstrapping a crosslingual semantic parser


Abstract

Recent progress in semantic parsing scarcely considers languages other than English, and professional translation of training data can be prohibitively expensive. We adapt a semantic parser trained on a single language, such as English, to new languages and multiple domains with minimal annotation. We ask whether machine translation is an adequate substitute for training data, and extend this investigation to bootstrapping through joint training with English, paraphrasing, and multilingual pre-trained models. We develop a Transformer-based parser that combines paraphrases by ensembling attention over multiple encoders, and present new versions of ATIS and Overnight in German and Chinese for evaluation. Experimental results indicate that machine translation can approximate training data in a new language for accurate parsing when augmented with paraphrasing through multiple MT engines. When MT is inadequate, our approach achieves parsing accuracy within 2% of full translation while using only 50% of the training data.
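
For intuition, the "ensembling attention over multiple encoders" mentioned in the abstract can be sketched roughly as follows. This is a minimal, hypothetical PyTorch sketch, not the paper's implementation: each paraphrase of an utterance is encoded separately, the decoder attends over each encoder's hidden states, and the resulting context vectors are averaged. The class and variable names are illustrative only.

    # Minimal sketch (assumption, not the authors' code): one attention pass per
    # paraphrase encoder, then average the per-encoder context vectors.
    import torch
    import torch.nn as nn

    class MultiEncoderAttention(nn.Module):
        def __init__(self, d_model: int, n_heads: int = 8):
            super().__init__()
            self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

        def forward(self, decoder_states, encoder_outputs):
            # decoder_states: (batch, tgt_len, d_model)
            # encoder_outputs: list of (batch, src_len_i, d_model), one per paraphrase
            contexts = []
            for memory in encoder_outputs:
                ctx, _ = self.attn(decoder_states, memory, memory)
                contexts.append(ctx)
            # Ensemble by averaging the per-encoder context vectors
            return torch.stack(contexts, dim=0).mean(dim=0)

    # Usage: three paraphrase encodings of the same utterance
    batch, tgt_len, d_model = 2, 5, 256
    decoder_states = torch.randn(batch, tgt_len, d_model)
    encodings = [torch.randn(batch, n, d_model) for n in (7, 9, 8)]
    fused = MultiEncoderAttention(d_model)(decoder_states, encodings)
    print(fused.shape)  # torch.Size([2, 5, 256])

Averaging is only one plausible way to combine the per-encoder contexts; the point of the sketch is that the decoder sees all paraphrase encodings jointly rather than a single source encoding.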

Cite

APA

Sherborne, T., Xu, Y., & Lapata, M. (2020). Bootstrapping a crosslingual semantic parser. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 499–517). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.findings-emnlp.45
