Learning about Word Vector Representations and Deep Learning through Implementing Word2vec

Abstract

Word vector representations are an essential part of an NLP curriculum. Here, we describe a homework assignment in which students implement a popular method for learning word vectors, word2vec. Students implement the core parts of the method, including text preprocessing, negative sampling, and gradient descent. Starter code provides guidance and handles basic operations, which allows students to focus on the conceptually challenging aspects. After generating their vectors, students evaluate them using qualitative and quantitative tests.
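The core steps the abstract names (preprocessing, negative sampling, gradient descent) can be sketched as a minimal skip-gram trainer in plain Python. This is an illustrative assumption of how such a homework might look, not the paper's actual starter code; all names and hyperparameters here are invented for the example.

```python
# Hypothetical minimal sketch of skip-gram word2vec with negative
# sampling; illustrative only, not the assignment's starter code.
import math
import random

random.seed(0)

# --- preprocessing: tokenize and map words to integer ids ---
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
w2i = {w: i for i, w in enumerate(vocab)}
ids = [w2i[w] for w in corpus]

V, D = len(vocab), 8                      # vocab size, embedding dim
W_in = [[random.gauss(0, 0.1) for _ in range(D)] for _ in range(V)]
W_out = [[random.gauss(0, 0.1) for _ in range(D)] for _ in range(V)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

lr, window, k = 0.05, 2, 3                # learning rate, window, negatives

# --- training: gradient descent on positive and negative pairs ---
for epoch in range(50):
    for pos, center in enumerate(ids):
        lo, hi = max(0, pos - window), min(len(ids), pos + window + 1)
        for ctx in ids[lo:pos] + ids[pos + 1:hi]:
            # one observed (positive) pair plus k uniformly drawn negatives
            samples = [(ctx, 1.0)] + [(random.randrange(V), 0.0)
                                      for _ in range(k)]
            g_center = [0.0] * D
            for out, label in samples:
                # gradient of logistic loss w.r.t. the dot-product score
                grad = sigmoid(dot(W_in[center], W_out[out])) - label
                for d in range(D):
                    g_center[d] += grad * W_out[out][d]
                    W_out[out][d] -= lr * grad * W_in[center][d]
            for d in range(D):
                W_in[center][d] -= lr * g_center[d]
```

After training, the rows of `W_in` serve as the learned word vectors; a qualitative evaluation of the kind the abstract mentions would compare cosine similarities between them.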

Citation (APA)

Jurgens, D. (2021). Learning about Word Vector Representations and Deep Learning through Implementing Word2vec. In Teaching NLP 2021 - Proceedings of the 5th Workshop on Teaching Natural Language Processing (pp. 108–111). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.teachingnlp-1.19
