Analysis of latent Dirichlet allocation and non-negative matrix factorization using latent semantic indexing

Abstract

A word is a major attribute in the field of opinion/text mining: based on this attribute, it is decided whether a word serves as a keyword, aspect, feature, entity, title, or topic. A great deal of work has been done to detect such targets using both supervised and unsupervised approaches, and the detected targets feed further processing such as text analytics, sentiment analysis, information retrieval, and search. Latent Dirichlet allocation (LDA) and non-negative matrix factorization (NMF) are the major models used for detecting topics. Understanding these algorithms in depth is necessary for those who want to extend them, yet the opinion/text mining research community often uses them as black boxes, which leaves open the question of which model detects topics most accurately. Latent semantic indexing (LSI) is regarded as the best approach for finding the document that best matches a given query. In this study, we analyzed the LDA and NMF models using LSI to determine which is better suited for opinion/text mining and found that both perform very well, with NMF slightly outperforming LDA.
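As a rough sketch of the two topic models the abstract compares, the Python snippet below fits LDA on raw term counts and NMF on TF-IDF weights over a toy corpus using scikit-learn. The library, corpus, topic count, and preprocessing choices are illustrative assumptions for demonstration only, not the authors' experimental setup.

# Minimal sketch (not the authors' pipeline): fit LDA and NMF topic models
# on a tiny toy corpus with scikit-learn. Corpus, topic count, and
# vectorizer settings are illustrative assumptions.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.decomposition import LatentDirichletAllocation, NMF

docs = [
    "the battery life of this phone is great",
    "the camera quality is poor in low light",
    "battery drains fast but the screen is sharp",
    "excellent camera and bright screen",
]

n_topics = 2  # assumed number of topics for this toy example

# LDA is typically fit on raw term counts.
count_vec = CountVectorizer(stop_words="english")
counts = count_vec.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=n_topics, random_state=0).fit(counts)

# NMF is usually fit on TF-IDF weights.
tfidf_vec = TfidfVectorizer(stop_words="english")
tfidf = tfidf_vec.fit_transform(docs)
nmf = NMF(n_components=n_topics, random_state=0).fit(tfidf)

def top_words(model, feature_names, n=3):
    """Return the n highest-weighted words for each learned topic."""
    return [
        [feature_names[i] for i in topic.argsort()[::-1][:n]]
        for topic in model.components_
    ]

print("LDA topics:", top_words(lda, count_vec.get_feature_names_out()))
print("NMF topics:", top_words(nmf, tfidf_vec.get_feature_names_out()))

The paper's comparison goes further by scoring the models' outputs with LSI; the snippet only illustrates how each model produces topic-word distributions from the same corpus.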

Citation (APA)

Saqib, S. M., Ahmad, S., Syed, A. H., Naeem, T., & Alotaibi, F. M. (2019). Analysis of latent Dirichlet allocation and non-negative matrix factorization using latent semantic indexing. International Journal of Advanced and Applied Sciences, 6(10), 94–102. https://doi.org/10.21833/ijaas.2019.10.015
