Are Structural Concepts Universal in Transformer Language Models? Towards Interpretable Cross-Lingual Generalization

3 citations · 9 readers (Mendeley)

Abstract

Large language models (LLMs) have exhibited considerable cross-lingual generalization abilities, whereby they implicitly transfer knowledge across languages. However, the transfer is not equally successful for all languages, especially for low-resource ones, which poses an ongoing challenge. It is unclear whether we have reached the limits of implicit cross-lingual generalization and if explicit knowledge transfer is viable. In this paper, we investigate the potential for explicitly aligning conceptual correspondence between languages to enhance cross-lingual generalization. Using the syntactic aspect of language as a testbed, our analyses of 43 languages reveal a high degree of alignability among the spaces of structural concepts within each language for both encoder-only and decoder-only LLMs. We then propose a meta-learning-based method to learn to align conceptual spaces of different languages, which facilitates zero-shot and few-shot generalization in concept classification and also offers insights into the cross-lingual in-context learning phenomenon. Experiments on syntactic analysis tasks show that our approach achieves competitive results with state-of-the-art methods and narrows the performance gap between languages, particularly benefiting those with limited resources.
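The abstract's central empirical claim is that the spaces of structural concepts in different languages are highly alignable. As a purely illustrative sketch (not the authors' meta-learning method, and using toy data rather than real LLM representations), the idea of testing alignability can be captured with an orthogonal Procrustes fit between two concept-embedding spaces:

```python
import numpy as np

# Illustrative sketch: how alignable are two languages' "concept spaces"
# under an orthogonal (rotation) map? All data below are synthetic toys,
# not actual transformer representations.
rng = np.random.default_rng(0)

# Toy concept embeddings for a "source" language: 5 structural concepts in 4-d.
source = rng.normal(size=(5, 4))

# Simulate a "target" language whose space is a rotated copy plus small noise.
q, _ = np.linalg.qr(rng.normal(size=(4, 4)))   # random orthogonal matrix
target = source @ q + 0.01 * rng.normal(size=(5, 4))

# Orthogonal Procrustes: rotation W minimizing ||source @ W - target||_F,
# obtained from the SVD of the cross-covariance matrix.
u, _, vt = np.linalg.svd(source.T @ target)
w = u @ vt

# Small relative residual indicates the two spaces are highly alignable.
error = np.linalg.norm(source @ w - target) / np.linalg.norm(target)
print(f"relative alignment error: {error:.4f}")
```

In the paper's setting, low residuals of this kind across 43 languages would motivate explicitly learning such alignments rather than relying only on implicit transfer.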

Citation (APA)

Xu, N., Zhang, Q., Ye, J., Zhang, M., & Huang, X. (2023). Are Structural Concepts Universal in Transformer Language Models? Towards Interpretable Cross-Lingual Generalization. In Findings of the Association for Computational Linguistics: EMNLP 2023 (pp. 13951–13976). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.findings-emnlp.931
