Previous similarity-based WSD systems have devoted much effort to learning comprehensive sense embeddings from contextual representations and knowledge sources. However, the context embedding of an ambiguous word is learned using only the sentence in which the word appears, neglecting its global context. In this paper, we investigate the contribution of both the word-level and the sense-level global context of an ambiguous word to disambiguation. Experiments show that the Context-Oriented Embedding (COE) improves a similarity-based system's WSD performance by relatively large margins, achieving state-of-the-art results on all-words WSD benchmarks in the knowledge-based category.
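The general decision rule of a similarity-based WSD system, as described above, can be sketched as follows. This is a minimal illustration with toy hand-made embeddings, not the paper's COE method: the sense labels and vectors are invented for the example.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def disambiguate(context_emb, sense_embs):
    """Return the sense whose embedding is most similar to the context embedding."""
    return max(sense_embs, key=lambda s: cosine(context_emb, sense_embs[s]))

# Toy sense inventory for "bank" (illustrative vectors only).
senses = {
    "bank.financial": [0.9, 0.1, 0.0],
    "bank.river":     [0.1, 0.8, 0.2],
}
# Toy context embedding for "I deposited money at the bank."
ctx = [0.8, 0.2, 0.1]
print(disambiguate(ctx, senses))  # → bank.financial
```

In a real system the context embedding would come from a pretrained encoder, and enriching it with global context, as the paper proposes, changes only the `ctx` vector, not this selection rule.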
Wang, M., Zhang, J., & Wang, Y. (2021). Enhancing the Context Representation in Similarity-based Word Sense Disambiguation. In EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 8965–8973). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.706