Abstract
Contextualized word embeddings have boosted many NLP tasks compared with traditional static word embeddings. However, a word used in a specific sense may still have different contextualized embeddings across its various contexts. To further investigate what contextualized word embeddings capture, this paper analyzes whether they can indicate the corresponding sense definitions and proposes a general framework that explains word meanings given contextualized word embeddings, enabling better interpretation. The experiments show that both ELMo and BERT embeddings can be interpreted as readable textual definitions, and the findings may help the research community better understand what these embeddings capture.
Citation
Chang, T. Y., & Chen, Y. N. (2019). What does this word mean? Explaining contextualized embeddings with natural language definition. In EMNLP-IJCNLP 2019 - 2019 Conference on Empirical Methods in Natural Language Processing and 9th International Joint Conference on Natural Language Processing, Proceedings of the Conference (pp. 6064–6070). Association for Computational Linguistics. https://doi.org/10.18653/v1/D19-1627