Topic Modeling and Word Embeddings

  • Beysolow II, T.

Abstract

This paper discusses approaches to static topic modeling, in particular an improved method based on parametrizing topics as continuous distributions over the space of word embeddings. Word embeddings trained on large corpora have been shown to reflect semantic interdependencies. We therefore incorporate vectorized word representations, trained with the Word2Vec neural network, into the generative process of topic modeling. An alternative approach based on a beta approximation of the mutual information distribution over embeddings is proposed and compared with the vanilla LDA and Gaussian LDA methods. Copyright ©2018 for the individual papers by the papers' authors.
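
As a rough illustration of the two baseline components the abstract compares against, the sketch below trains Word2Vec embeddings and a vanilla LDA model. The use of gensim, the toy corpus, and all parameter values are assumptions for illustration; the paper does not prescribe a specific library or settings, and the improved embedding-based parametrization it proposes is not reproduced here.

```python
# Minimal sketch (assumed setup, not the paper's implementation):
# Word2Vec embeddings plus a vanilla LDA baseline, both via gensim.
from gensim.corpora import Dictionary
from gensim.models import LdaModel, Word2Vec

# Tokenized toy corpus (hypothetical data for illustration only).
docs = [
    ["topic", "model", "word", "embedding"],
    ["neural", "network", "word", "vector"],
    ["latent", "dirichlet", "allocation", "topic"],
]

# Word2Vec maps each word to a dense vector whose geometry reflects
# co-occurrence statistics -- the "semantic interdependencies" the
# abstract refers to.
w2v = Word2Vec(sentences=docs, vector_size=50, window=2, min_count=1, epochs=50)
print(w2v.wv["topic"].shape)  # -> (50,)

# Vanilla LDA baseline: topics are distributions over the discrete
# vocabulary and make no use of the embedding space, which is what the
# embedding-parametrized methods in the paper aim to improve on.
dictionary = Dictionary(docs)
bow = [dictionary.doc2bow(d) for d in docs]
lda = LdaModel(corpus=bow, id2word=dictionary, num_topics=2, passes=10)
print(lda.print_topics())
```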

Cite

CITATION STYLE

APA

Beysolow II, T. (2018). Topic Modeling and Word Embeddings. In Applied Natural Language Processing with Python (pp. 77–119). Apress. https://doi.org/10.1007/978-1-4842-3733-5_4
