Introducing off-diagonal elements to singular value matrix in probabilistic Latent Semantic Indexing

Citations: 1
Readers: 5 (Mendeley users who have this article in their library)

Abstract

Probabilistic Latent Semantic Indexing (pLSI) is a fundamental method for the analysis of text and related resources, built on a simple statistical model. Its simplicity gives it high extensibility and scalability. pLSI can also be viewed as a matrix factorization method, like Singular Value Decomposition (SVD) or Nonnegative Matrix Factorization. As in SVD, pLSI yields three matrices, one of which is diagonal; the diagonal elements of this matrix correspond to the singular values of SVD. However, it is not entirely clear what this diagonal matrix represents in pLSI, nor whether the diagonality constraint is actually necessary. This question is the starting point of this paper. To answer it, we show that introducing off-diagonal elements into the singular value matrix of pLSI is equivalent to permitting joint probabilities between different hidden variables. This extension sacrifices neither scalability nor simplicity, and our experiments demonstrate that it improves tolerance of the over-learning and over-fitting problems.
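The factorization view described in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the authors' implementation: it assumes the usual pLSI decomposition P(d, w) = Σ_z P(d|z) P(z) P(w|z), written as U Σ Vᵀ with a diagonal Σ, and contrasts it with the paper's extension, where Σ gains off-diagonal entries and becomes a full joint distribution P(z, z') over pairs of hidden variables. All matrix values here are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n_docs, n_words, n_topics = 4, 6, 2

def normalize_cols(a):
    """Make each column a probability distribution (sums to 1)."""
    return a / a.sum(axis=0, keepdims=True)

U = normalize_cols(rng.random((n_docs, n_topics)))   # P(d|z)
V = normalize_cols(rng.random((n_words, n_topics)))  # P(w|z')

# Standard pLSI: diagonal "singular value" matrix holding P(z).
sigma_diag = np.diag([0.6, 0.4])

# Extension: off-diagonal entries allowed, giving a joint P(z, z')
# between different hidden variables (entries still sum to 1).
sigma_full = np.array([[0.5, 0.1],
                       [0.1, 0.3]])

for sigma in (sigma_diag, sigma_full):
    # P(d, w) = sum over z, z' of P(d|z) P(z, z') P(w|z')
    P = U @ sigma @ V.T
    # Either way, the result is a valid joint distribution over (d, w).
    assert np.isclose(P.sum(), 1.0)
    assert (P >= 0).all()
```

Because U's and V's columns and Σ's entries each sum to one, the product U Σ Vᵀ sums to one whether or not Σ is diagonal, which is why the extension keeps the probabilistic interpretation intact.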

Citation (APA)

Shibayama, N., & Nakagawa, H. (2011). Introducing off-diagonal elements to singular value matrix in probabilistic Latent Semantic Indexing. Transactions of the Japanese Society for Artificial Intelligence, 26(1), 262–272. https://doi.org/10.1527/tjsai.26.262
