Stochastic bounds for inference in topic models


Abstract

Topic models are popular for modeling discrete data (e.g., texts, images, videos, links) and provide an efficient way to discover hidden structures/semantics in massive data. The problem of posterior inference for individual texts is particularly important in streaming environments, but is often intractable in the worst case. Some existing methods for posterior inference are approximate and have no guarantee on either quality or convergence rate. The Online Maximum a Posteriori Estimation algorithm (OPE) [13] has more attractive properties than existing inference approaches, including theoretical guarantees on quality and a fast convergence rate. In this paper, we introduce three new algorithms (called OPE1, OPE2, OPE3) that improve OPE by using stochastic bounds when doing inference. Our algorithms not only maintain the key advantages of OPE but often outperform OPE and existing algorithms. Our new algorithms have been employed to develop new, effective methods for learning topic models from massive/streaming text collections.
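The abstract describes OPE only at a high level. As a hedged illustration (a sketch of the general OPE idea from [13], not the paper's exact algorithm or its OPE1–OPE3 variants), MAP inference of a document's topic proportions can be framed as maximizing, over the simplex, the sum of a likelihood term and a Dirichlet prior term; at each iteration one of the two terms is sampled uniformly to form a stochastic estimate of the objective, followed by a Frank–Wolfe step. All names and parameters below are illustrative assumptions.

```python
import numpy as np

def ope_sketch(d, beta, alpha=0.1, T=100, rng=None):
    """Illustrative OPE-style MAP inference for one document (a sketch,
    not the published algorithm).

    d     : word-count vector of the document, shape (V,)
    beta  : topic-word matrix, shape (K, V), rows sum to 1
    alpha : Dirichlet hyperparameter (typically alpha < 1)

    Approximately maximizes over the simplex
        f(theta) = sum_j d_j * log(theta @ beta[:, j])
                   + (alpha - 1) * sum_k log(theta_k)
    by sampling, at each step, either the likelihood part or the
    prior part, and taking a Frank-Wolfe step for the running
    stochastic estimate of f.
    """
    rng = np.random.default_rng(rng)
    K = beta.shape[0]
    theta = np.full(K, 1.0 / K)      # start at the simplex centre
    counts = np.zeros(2)             # how often each part was sampled
    for t in range(1, T + 1):
        counts[rng.integers(2)] += 1  # 0: likelihood, 1: prior
        # Gradient of the unbiased stochastic estimate of f at theta.
        grad = np.zeros(K)
        if counts[0] > 0:
            p = theta @ beta                           # shape (V,)
            grad += (2 * counts[0] / t) * (beta @ (d / p))
        if counts[1] > 0:
            grad += (2 * counts[1] / t) * (alpha - 1) / theta
        # Frank-Wolfe: move toward the simplex vertex maximizing grad.
        vertex = np.zeros(K)
        vertex[int(np.argmax(grad))] = 1.0
        theta += (vertex - theta) / (t + 1)            # step size 1/(t+1)
    return theta
```

Because each update is a convex combination of the current iterate and a simplex vertex, the iterate always stays a strictly positive probability vector, which keeps the log terms well defined throughout.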

Citation (APA)

Bui, X., Vu, T., & Than, K. (2017). Stochastic bounds for inference in topic models. In Advances in Intelligent Systems and Computing (Vol. 538 AISC, pp. 582–592). Springer Verlag. https://doi.org/10.1007/978-3-319-49073-1_62
