Mini-batch variational inference for time-aware topic modeling

Abstract

This paper proposes a time-aware topic model and a mini-batch variational inference method for exploring chronological trends in document contents. Our contribution is twofold. First, to extract topics in a time-aware manner, our method uses two vector embeddings: an embedding of latent topics and an embedding of document timestamps. By combining these two embeddings and applying the softmax function, we obtain as many word probability distributions per topic as there are document timestamps. This modeling enables us to extract remarkable topical trends. Second, to achieve memory efficiency, the variational inference is implemented as mini-batch gradient ascent maximizing the evidence lower bound. This lets us estimate parameters in a way similar to neural network training, and our method was in fact implemented with a deep learning framework. The evaluation results show that using document timestamps improves test set perplexity, and that our test perplexity is comparable to that of collapsed Gibbs sampling, which is less memory-efficient than the proposed inference.
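
Below is a minimal sketch, not the authors' code, of the per-timestamp topic-word distributions described in the abstract, written in PyTorch. The abstract only says the two embeddings are "combined" before the softmax, so the combination used here (element-wise addition followed by a linear projection to vocabulary logits) is an assumption, and all class names and dimensions are hypothetical.

    import torch
    import torch.nn as nn

    class TimeAwareTopics(nn.Module):
        """Per-topic, per-timestamp word distributions from two embeddings."""

        def __init__(self, num_topics, num_timestamps, vocab_size, embed_dim):
            super().__init__()
            # Embedding of latent topics: one vector per topic.
            self.topic_emb = nn.Parameter(torch.randn(num_topics, embed_dim))
            # Embedding of document timestamps: one vector per timestamp.
            self.time_emb = nn.Parameter(torch.randn(num_timestamps, embed_dim))
            # Projection from the combined embedding to vocabulary logits
            # (an assumption; the abstract does not specify this step).
            self.word_proj = nn.Linear(embed_dim, vocab_size)

        def forward(self):
            # Combine the two embeddings (addition is one plausible choice)
            # to get one vector per (topic, timestamp) pair: shape (K, T, D).
            combined = self.topic_emb.unsqueeze(1) + self.time_emb.unsqueeze(0)
            # Softmax over the vocabulary yields as many word distributions
            # per topic as there are timestamps: shape (K, T, V).
            return torch.softmax(self.word_proj(combined), dim=-1)

    model = TimeAwareTopics(num_topics=50, num_timestamps=20,
                            vocab_size=10000, embed_dim=100)
    beta = model()  # beta[k, t] = word distribution of topic k at timestamp t

In the full method, distributions like these would parameterize the topic model whose evidence lower bound is maximized by mini-batch gradient ascent, e.g. with a standard stochastic optimizer, which is what makes the estimation resemble neural network training.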

Cite

CITATION STYLE

APA

Masada, T., & Takasu, A. (2018). Mini-batch variational inference for time-aware topic modeling. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11013 LNAI, pp. 156–164). Springer Verlag. https://doi.org/10.1007/978-3-319-97310-4_18
