Log-bilinear language models such as SkipGram and GloVe have been shown to capture high-quality syntactic and semantic relationships between words in a vector space. We revisit the relationship between the SkipGram and GloVe models from a machine learning viewpoint, and show that the two methods can easily be merged into a unified form. Using this unified form, we then isolate the configuration factors on which the two methods differ. We also empirically investigate which factor is responsible for the performance differences often observed on the widely examined word similarity and analogy tasks.
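As background, the separately published objectives of the two models can be sketched as follows; the notation is the conventional one from the original SkipGram-with-negative-sampling and GloVe papers, not the unified form derived in this work, so it should be read as context rather than as the paper's own formulation.

For SkipGram with negative sampling, each observed word-context pair (w, c) contributes
\[
\ell(w, c) = \log \sigma(\mathbf{v}_w^{\top} \mathbf{v}_c) + k\, \mathbb{E}_{c' \sim P_n}\!\left[\log \sigma(-\mathbf{v}_w^{\top} \mathbf{v}_{c'})\right],
\]
where \(\sigma\) is the logistic function, \(k\) is the number of negative samples, and \(P_n\) is the noise distribution. GloVe instead minimizes a weighted least-squares loss over co-occurrence counts \(X_{ij}\):
\[
J = \sum_{i,j} f(X_{ij}) \left(\mathbf{w}_i^{\top} \tilde{\mathbf{w}}_j + b_i + \tilde{b}_j - \log X_{ij}\right)^2 .
\]
Both objectives fit word and context vectors against corpus co-occurrence statistics, which is, roughly, the shared structure that a unified view of the two models can exploit.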
Suzuki, J., & Nagata, M. (2015). A unified learning framework of skip-grams and global vectors. In ACL-IJCNLP 2015 - 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, Proceedings of the Conference (Vol. 2, pp. 186–191). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/p15-2031