Efficient Learning for undirected topic models

1 citation · 97 Mendeley readers

Abstract

The Replicated Softmax model, a well-known undirected topic model, is powerful for extracting semantic representations of documents, but traditional learning strategies such as Contrastive Divergence are very inefficient. This paper provides a novel estimator that speeds up learning based on Noise Contrastive Estimation, extended to handle documents of varying lengths and weighted inputs. Experiments on two benchmarks show that the new estimator achieves high learning efficiency and high accuracy on document retrieval and classification.
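For context, Noise Contrastive Estimation replaces the intractable partition function by training a classifier to distinguish observed data from samples drawn from a known noise distribution. The sketch below shows the generic NCE objective only; it is not the paper's document-specific extension, and the function names and the use of NumPy are illustrative assumptions.

```python
import numpy as np


def sigmoid(x):
    # numerically plain logistic function
    return 1.0 / (1.0 + np.exp(-x))


def nce_loss(log_p_model_data, log_p_noise_data,
             log_p_model_noise, log_p_noise_noise, k):
    """Generic NCE objective (a sketch, not the paper's estimator).

    Each argument is an array of log-probabilities: the model's and the
    noise distribution's scores on observed data, and on k noise samples
    per data point. The classifier's logit for "this came from the data"
    is log p_model(x) - log(k * p_noise(x)).
    """
    delta_data = log_p_model_data - (np.log(k) + log_p_noise_data)
    delta_noise = log_p_model_noise - (np.log(k) + log_p_noise_noise)
    # Maximize the log-probability of the correct labels;
    # return the negation so this reads as a loss to minimize.
    return -(np.mean(np.log(sigmoid(delta_data)))
             + k * np.mean(np.log(1.0 - sigmoid(delta_noise))))
```

A model that scores real documents above noise samples should incur a lower loss than one with the scores reversed, which is what gradient descent on this objective exploits.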

Citation (APA)

Gu, J., & Li, V. O. K. (2015). Efficient Learning for undirected topic models. In ACL-IJCNLP 2015 - 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, Proceedings of the Conference (Vol. 2, pp. 162–167). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/p15-2027
