Abstract
Reverse dictionary is a sequence-to-vector task in which a gloss is provided as input and the model is trained to output a semantically matching word vector. Reverse dictionaries are useful in practical applications such as resolving the tip-of-the-tongue problem and assisting new language learners. In this paper, we evaluate a Transformer-based model with an added LSTM layer on this task in monolingual, multilingual, and cross-lingual zero-shot settings. Experiments are conducted on the five languages of the CODWOE dataset: English, French, Italian, Spanish, and Russian. Our work partially improves on the current baseline of the CODWOE competition and offers insight into the feasibility of cross-lingual approaches to the reverse dictionary task. The code is available at https://github.com/honghanhh/codwoe2021.
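To make the sequence-to-vector framing concrete, the following is a minimal toy sketch of a reverse-dictionary lookup: a gloss is encoded into a vector and matched to the nearest word embedding by cosine similarity. The averaging encoder and the tiny embedding table are illustrative stand-ins only, not the paper's Transformer-plus-LSTM model or the CODWOE embeddings.

```python
import math

# Toy word embeddings (hypothetical values, for illustration only).
EMB = {
    "feline": [0.9, 0.1, 0.0],
    "canine": [0.1, 0.9, 0.0],
    "boat":   [0.0, 0.1, 0.9],
}

def encode_gloss(tokens):
    """Encode a gloss by averaging embeddings of its known tokens
    (a trivial stand-in for a learned sequence-to-vector encoder)."""
    vecs = [EMB[t] for t in tokens if t in EMB]
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(3)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def reverse_dictionary(gloss):
    """Return the vocabulary word whose embedding best matches the gloss."""
    g = encode_gloss(gloss.split())
    return max(EMB, key=lambda w: cosine(EMB[w], g))

print(reverse_dictionary("a small feline animal"))  # -> feline
```

In the paper's setting, the trivial encoder above is replaced by a trained Transformer (with an LSTM layer), and the predicted vector is compared against pretrained target embeddings rather than a hand-written table.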
Citation
Tran, H. T. H., Martinc, M., Purver, M., & Pollak, S. (2022). JSI at SemEval-2022 Task 1: CODWOE - Reverse Dictionary: Monolingual, multilingual, and cross-lingual approaches. In SemEval 2022 - 16th International Workshop on Semantic Evaluation, Proceedings of the Workshop (pp. 101–106). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.semeval-1.12