Stochastic complexity for mixture of exponential families in Variational Bayes


Abstract

Variational Bayesian learning, proposed as an approximation of Bayesian learning, has provided computational tractability and good generalization performance in many applications. However, little has been done to investigate its theoretical properties. In this paper, we discuss Variational Bayesian learning of mixtures of exponential families and derive the asymptotic form of the stochastic complexity. We show that the stochastic complexity is smaller than that of regular statistical models, which implies that the advantage of Bayesian learning is retained in Variational Bayesian learning. The stochastic complexity, also called the marginal likelihood or the free energy, is important not only for the model selection problem but also for assessing the accuracy of the Variational Bayesian approach as an approximation of true Bayesian learning. © Springer-Verlag Berlin Heidelberg 2005.
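As background (standard definitions and notation, not taken from the paper itself), the stochastic complexity mentioned in the abstract is the negative log marginal likelihood. For a regular statistical model with $d$ parameters it grows with the BIC-style penalty $(d/2)\log n$, and the variational free energy minimized in Variational Bayes is an upper bound on it; the paper's result concerns the $\log n$ coefficient for mixture models, which falls below $d/2$:

```latex
% Stochastic complexity of a dataset X^n = (x_1, ..., x_n),
% with likelihood p(x | w) and prior \varphi(w):
F(X^n) = -\log \int \prod_{i=1}^{n} p(x_i \mid w)\, \varphi(w)\, dw

% Regular-model asymptotics (S: entropy of the true distribution):
\mathbb{E}\left[ F(X^n) \right] = nS + \frac{d}{2}\log n + O(1)

% Variational free energy: an upper bound obtained by restricting
% the posterior approximation to a tractable family of q(w):
\bar{F}(X^n) = \min_{q} \left[ F(X^n)
  + \mathrm{KL}\!\left( q(w) \,\middle\|\, p(w \mid X^n) \right) \right]
  \;\ge\; F(X^n)
```

Because the KL divergence is nonnegative, comparing the asymptotic form of $\bar{F}$ with that of $F$ quantifies how tight the variational approximation is, which is the sense in which the abstract says stochastic complexity "enables us to discuss the accuracy" of the approach.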

Citation (APA)

Watanabe, K., & Watanabe, S. (2005). Stochastic complexity for mixture of exponential families in Variational Bayes. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3734 LNAI, pp. 107–121). https://doi.org/10.1007/11564089_10
