Abstract
Recent methods for learning word embeddings, such as GloVe or Word2Vec, have succeeded in representing semantic and syntactic relations spatially. We extend GloVe by introducing separate vectors for the base form and the grammatical form of a word, using a morphosyntactic dictionary for this purpose. This allows the vectors to better capture the properties of words. We also present model results for the word analogy test and introduce a new test based on WordNet.
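The abstract describes splitting a word's representation into a base-form (lemma) vector and a grammatical-form vector, linked through a morphosyntactic dictionary. A minimal sketch of this idea is below; the additive composition, the toy dictionary entries, and all names (`morpho_dict`, `lemma_vecs`, `form_vecs`, `embed`) are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

DIM = 4  # toy embedding dimensionality

# Toy morphosyntactic dictionary: surface form -> (lemma, grammatical tag).
# Entries are illustrative (Polish inflections of "kot", i.e. "cat").
morpho_dict = {
    "kotem": ("kot", "sg:inst"),  # instrumental case
    "kota":  ("kot", "sg:gen"),   # genitive case
}

rng = np.random.default_rng(0)
lemma_vecs = {"kot": rng.normal(size=DIM)}
form_vecs = {"sg:inst": rng.normal(size=DIM), "sg:gen": rng.normal(size=DIM)}

def embed(word):
    """Compose a surface-word vector from its lemma and form vectors
    (additive composition assumed here for illustration)."""
    lemma, tag = morpho_dict[word]
    return lemma_vecs[lemma] + form_vecs[tag]

# Two inflections of the same lemma share the lemma component, so their
# difference isolates the contrast between the two grammatical forms.
delta = embed("kotem") - embed("kota")
```

Under this assumed composition, analogy-style vector arithmetic over inflections reduces to arithmetic over the shared form vectors, which is one plausible motivation for separating the two components.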
Citation (APA)
Jurdziński, G. (2016). Word embeddings for morphologically complex languages. Schedae Informaticae, 25, 127–138. https://doi.org/10.4467/20838476SI.16.010.6191