Morph Call: Probing Morphosyntactic Content of Multilingual Transformers


Abstract

The outstanding performance of transformer-based language models on a great variety of NLP and NLU tasks has stimulated interest in exploring their inner workings. Recent research has focused primarily on higher-level and complex linguistic phenomena such as syntax, semantics, world knowledge, and common sense. The majority of these studies are anglocentric, and little is known about other languages, particularly their morphosyntactic properties. To this end, our work presents Morph Call, a suite of 46 probing tasks for four Indo-European languages with different morphology: English, French, German, and Russian. We propose a new type of probing task based on the detection of guided sentence perturbations. We use a combination of neuron-, layer-, and representation-level introspection techniques to analyze the morphosyntactic content of four multilingual transformers, including their less explored distilled versions. In addition, we examine how fine-tuning for POS-tagging affects the models' knowledge. The results show that fine-tuning can both improve and decrease probing performance, and can change how morphosyntactic knowledge is distributed across the model. The code and data are publicly available; we hope they help fill the gaps in this less studied aspect of transformers.
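To make the representation-level probing recipe concrete, the sketch below trains a linear classifier on frozen hidden states from each layer of a multilingual transformer, which is the common setup for this kind of analysis. It is a minimal illustration under stated assumptions: the model choice (bert-base-multilingual-cased), the toy sentences, the binary label, and mean pooling are all illustrative, not the paper's actual Morph Call tasks or data.

```python
# Minimal layer-wise probing sketch: a logistic-regression probe on
# frozen hidden states. Model, data, and pooling are assumptions for
# illustration, not the paper's setup.
import torch
from transformers import AutoModel, AutoTokenizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained(
    "bert-base-multilingual-cased", output_hidden_states=True
)
model.eval()

# Toy probing data: sentences paired with a binary morphosyntactic
# label (here, hypothetically, subject number: 0 = singular, 1 = plural).
sentences = ["The dog runs.", "The dogs run.", "A cat sleeps.", "Cats sleep."]
labels = [0, 1, 0, 1]

def sentence_embeddings(texts, layer):
    """Mean-pool token representations from a given hidden layer."""
    with torch.no_grad():
        enc = tokenizer(texts, return_tensors="pt", padding=True)
        hidden = model(**enc).hidden_states[layer]  # (batch, seq, dim)
        mask = enc["attention_mask"].unsqueeze(-1)
        return ((hidden * mask).sum(1) / mask.sum(1)).numpy()

# Probe each layer with a linear classifier; higher accuracy suggests
# the layer encodes the property more accessibly. A real probe would
# use held-out data or cross-validation; this reports train accuracy
# only for brevity.
for layer in range(1, model.config.num_hidden_layers + 1):
    X = sentence_embeddings(sentences, layer)
    clf = LogisticRegression(max_iter=1000).fit(X, labels)
    acc = accuracy_score(labels, clf.predict(X))
    print(f"layer {layer}: train acc = {acc:.2f}")
```

A linear probe is the standard design choice here because its low capacity means high accuracy can be attributed to the representation rather than the classifier; the layer-by-layer loop is what reveals how knowledge is distributed across the model, including shifts after fine-tuning.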

Citation (APA)

Mikhailov, V., Serikov, O., & Artemova, E. (2021). Morph Call: Probing Morphosyntactic Content of Multilingual Transformers. In SIGTYP 2021 - 3rd Workshop on Research in Computational Typology and Multilingual NLP, Proceedings of the Workshop (pp. 97–121). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.sigtyp-1.10
