Scaling semi-supervised Naive Bayes with feature marginals

Abstract

Semi-supervised learning (SSL) methods augment standard machine learning (ML) techniques to leverage unlabeled data. SSL techniques are often effective in text classification, where labeled data is scarce but large unlabeled corpora are readily available. However, existing SSL techniques typically require multiple passes over the entire unlabeled data set, making them impractical for the massive corpora produced today. In this paper, we show that improving marginal word frequency estimates using unlabeled data can enable semi-supervised text classification that scales to massive unlabeled data sets. We present a novel learning algorithm that optimizes a Naive Bayes model to accord with statistics calculated from the unlabeled corpus. In experiments on text topic classification and sentiment analysis, we show that our method is both more scalable and more accurate than SSL techniques from previous work. © 2013 Association for Computational Linguistics.
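The idea described in the abstract can be sketched as: estimate each word's marginal frequency with a single streaming pass over the unlabeled corpus (no per-document inference needed), then adjust the labeled Naive Bayes class conditionals so that their class mixture matches those marginals. The toy data and the crude proportional rescaling below are illustrative assumptions, not the paper's method; the paper instead maximizes labeled-data likelihood subject to the per-word marginal constraint.

```python
from collections import Counter

# Toy corpora (hypothetical data for illustration only).
labeled = [
    (["great", "movie", "fun"], 1),
    (["terrible", "movie", "boring"], 0),
]
unlabeled = [
    ["great", "fun", "great"],
    ["boring", "terrible", "plot"],
    ["great", "movie"],
]

vocab = sorted({w for doc, _ in labeled for w in doc} |
               {w for doc in unlabeled for w in doc})

def class_conditionals(label):
    """Standard multinomial NB estimates with Laplace smoothing."""
    counts = Counter(w for doc, y in labeled if y == label for w in doc)
    total = sum(counts.values()) + len(vocab)
    return {w: (counts[w] + 1) / total for w in vocab}

p_pos = sum(1 for _, y in labeled if y == 1) / len(labeled)
cond = {1: class_conditionals(1), 0: class_conditionals(0)}

# Feature marginals P(w), estimated in one cheap streaming pass over the
# (potentially massive) unlabeled corpus.
unl_counts = Counter(w for doc in unlabeled for w in doc)
unl_total = sum(unl_counts.values())
marginal = {w: (unl_counts[w] + 1) / (unl_total + len(vocab))
            for w in vocab}

# Rescale each word's conditionals so the class mixture matches the
# unlabeled marginal: P(w|+)P(+) + P(w|-)P(-) = P_hat(w).
# (Proportional rescaling is a stand-in for the paper's constrained
# likelihood optimization.)
for w in vocab:
    mix = cond[1][w] * p_pos + cond[0][w] * (1 - p_pos)
    scale = marginal[w] / mix
    cond[1][w] *= scale
    cond[0][w] *= scale
```

Because the marginals need only word counts, the unlabeled data is touched exactly once, which is what makes this style of SSL scale to very large corpora.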

Citation (APA)
Lucas, M. R., & Downey, D. (2013). Scaling semi-supervised Naive Bayes with feature marginals. In Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 343–351). Association for Computational Linguistics.
