Words are the building blocks of phrases, sentences, and documents. Word representation is thus critical for natural language processing (NLP). In this chapter, we introduce approaches to word representation learning, showing the paradigm shift from symbolic representation to distributed representation. We also describe valuable efforts to make word representations more informative and interpretable. Finally, we present applications of word representation learning in NLP and in interdisciplinary fields, including psychology, social sciences, history, and linguistics.
Citation: Hu, S., Liu, Z., Lin, Y., & Sun, M. (2023). Word Representation Learning. In Representation Learning for Natural Language Processing, Second Edition (pp. 29–68). Springer. https://doi.org/10.1007/978-981-99-1600-9_2