drsphelps at SemEval-2022 Task 2: Learning idiom representations using BERTRAM


Abstract

This paper describes our system for SemEval-2022 Task 2, Multilingual Idiomaticity Detection and Sentence Embedding, sub-task B. We modify a standard BERT sentence transformer by adding an embedding for each idiom, created with BERTRAM from a small number of contexts. We show that this technique improves the quality of idiom representations and leads to better performance on the task. We also analyse our final results and show that the quality of the produced idiom embeddings is highly sensitive to the quality of the input contexts.
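The core idea in the abstract, treating each idiom as a single new vocabulary item whose embedding row is produced externally (by BERTRAM, from a handful of contexts) and appended to the model's input-embedding matrix, can be sketched as follows. This is an illustrative toy, not the paper's implementation: the vocabulary, dimensions, and the `add_idiom` helper are assumptions, and the random vector stands in for a BERTRAM-produced embedding.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8  # real BERT uses 768

# Toy vocabulary and input-embedding matrix standing in for BERT's.
vocab = {"[PAD]": 0, "kick": 1, "the": 2, "bucket": 3}
embeddings = rng.normal(size=(len(vocab), dim))

def add_idiom(idiom: str, idiom_vector: np.ndarray) -> None:
    """Register the idiom as one token and append its embedding row."""
    global embeddings
    vocab[idiom] = len(vocab)
    embeddings = np.vstack([embeddings, idiom_vector[None, :]])

# Stand-in for a vector BERTRAM would learn from a few contexts.
bertram_vector = rng.normal(size=dim)
add_idiom("kick the bucket", bertram_vector)

# The idiom now maps to a single id with its own dedicated embedding,
# instead of being composed from the embeddings of its surface words.
idiom_id = vocab["kick the bucket"]
```

In the actual system, the appended row would come from BERTRAM rather than random initialisation, and the tokenizer would be extended so the multi-word idiom is segmented as this single token.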

Citation (APA)

Phelps, D. (2022). drsphelps at SemEval-2022 Task 2: Learning idiom representations using BERTRAM. In SemEval 2022 - 16th International Workshop on Semantic Evaluation, Proceedings of the Workshop (pp. 158–164). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.semeval-1.18
