Transfer Learning Parallel Metaphor using Bilingual Embeddings

Abstract

Automated metaphor detection in languages other than English is severely restricted because training corpora are comparatively rare. One way to overcome this problem is transfer learning. This paper gives an overview of transfer learning techniques applied to NLP. We first introduce the types of transfer learning, then present work focusing on: i) transfer learning with cross-lingual embeddings; ii) transfer learning in machine translation; and iii) transfer learning using pre-trained transformer models. The paper is complemented by first experiments that make use of bilingual embeddings generated from different sources of parallel data: we i) present the preparation of a parallel gold corpus; ii) examine the embedding spaces to search for metaphoric words cross-lingually; and iii) run first experiments in transfer learning of German metaphor from English labeled data only. Results show that the data sources used to train the bilingual embeddings, and the vocabulary these embeddings cover, are critical for learning metaphor cross-lingually.
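The cross-lingual search in step ii) can be illustrated with a minimal sketch: in a shared bilingual embedding space, target-language candidates for a source-language metaphoric word can be ranked by cosine similarity. The vectors and word list below are illustrative placeholders, not data or code from the paper.

```python
import numpy as np

# Toy shared bilingual embedding space (illustrative vectors only).
# In practice, such embeddings would be trained from parallel data.
embeddings = {
    "en:grasp":      np.array([0.9, 0.1, 0.3]),
    "en:understand": np.array([0.8, 0.2, 0.35]),
    "de:begreifen":  np.array([0.85, 0.15, 0.3]),
    "de:haus":       np.array([0.1, 0.9, 0.0]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def cross_lingual_neighbors(query, lang_prefix, k=2):
    """Rank words of the target language by similarity to the query word."""
    q = embeddings[query]
    candidates = [(w, cosine(q, v)) for w, v in embeddings.items()
                  if w.startswith(lang_prefix) and w != query]
    return sorted(candidates, key=lambda x: -x[1])[:k]

# Search for German neighbors of an English word labeled as metaphoric.
print(cross_lingual_neighbors("en:grasp", "de:"))
```

Because the two languages share one vector space, no translation step is needed at query time; the quality of the ranking depends entirely on how well the parallel training data covers the relevant vocabulary, which is the bottleneck the abstract points to.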

Citation (APA)

Berger, M. (2022). Transfer Learning Parallel Metaphor using Bilingual Embeddings. In FLP 2022 - 3rd Workshop on Figurative Language Processing, Proceedings of the Workshop (pp. 13–23). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.flp-1.3
