Combining local and global features in supervised word sense disambiguation

Abstract

Word Sense Disambiguation (WSD) is the task of identifying the sense of a polysemous word in a given context. Recently, word embeddings have been applied to WSD as additional input features of a supervised classifier. However, previous approaches use word embeddings narrowly, only to represent the words surrounding the target word. They may not make sufficient use of word embeddings for representing other kinds of features, such as dependency relations, word order, and global contexts (the whole document). In this work, we combine local and global features to perform WSD. We explore using word embeddings to capture word-order and dependency features, and we also use word embeddings to represent global contexts as global features. Experiments show that our methods outperform state-of-the-art methods on Lexical Sample WSD datasets.
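
A minimal sketch of the idea (not the authors' implementation): local features preserve word order by giving each window position around the target word its own embedding slot, a global feature averages the embeddings of the whole document, and both are concatenated and fed to a supervised classifier. The random embedding table, window size, and LinearSVC classifier below are illustrative assumptions, and the dependency-relation features mentioned above are omitted for brevity.

import numpy as np
from sklearn.svm import LinearSVC

DIM = 50
rng = np.random.default_rng(0)
vocab = {}  # word -> vector; random vectors stand in for pretrained embeddings

def embed(word):
    if word not in vocab:
        vocab[word] = rng.normal(size=DIM)
    return vocab[word]

def local_features(tokens, target_idx, window=2):
    # One slot per position around the target, so word order is preserved;
    # out-of-range positions contribute a zero vector.
    slots = []
    for offset in range(-window, window + 1):
        if offset == 0:
            continue
        i = target_idx + offset
        slots.append(embed(tokens[i]) if 0 <= i < len(tokens) else np.zeros(DIM))
    return np.concatenate(slots)

def global_features(document_tokens):
    # Average embedding of the whole document as the global-context feature.
    return np.mean([embed(w) for w in document_tokens], axis=0)

def features(tokens, target_idx, document_tokens):
    return np.concatenate([local_features(tokens, target_idx),
                           global_features(document_tokens)])

# Toy usage: two senses of "bank", one training instance each.
train = [("deposit money in the bank account".split(), 4, "bank_finance"),
         ("they sat on the river bank fishing".split(), 5, "bank_river")]
X = [features(toks, idx, toks) for toks, idx, _ in train]
y = [label for _, _, label in train]
clf = LinearSVC().fit(X, y)

test = "the bank raised interest rates".split()
print(clf.predict([features(test, 1, test)]))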

Citation (APA)

Lei, X., Cai, Y., Li, Q., Xie, H., Leung, H.-F., & Wang, F. L. (2017). Combining local and global features in supervised word sense disambiguation. In Lecture Notes in Computer Science (Vol. 10570 LNCS, pp. 117–131). Springer Verlag. https://doi.org/10.1007/978-3-319-68786-5_10
