Variational EM Learning of DSBNs with Conditional Deep Boltzmann Machines

Abstract

Variational EM (VEM) is an efficient parameter learning scheme for deep sigmoid belief networks (DSBNs), i.e., sigmoid belief networks with many layers of latent variables. The choice of the inference model that forms the variational lower bound of the log likelihood is critical in VEM learning. Mean field approximations and the wake-sleep algorithm use simple inference models that are computationally efficient, but these may approximate the true posterior densities poorly when the latent variables have strong mutual dependencies. In this paper, we describe a VEM learning method for DSBNs with a new inference model, the conditional deep Boltzmann machine (cDBM), an undirected graphical model capable of representing complex dependencies among latent variables. We show that this algorithm does not require computing the intractable partition function of the undirected cDBM model and can be accelerated with contrastive learning. The performance of the proposed method is evaluated and compared on handwritten digit data. © 2014 Springer International Publishing Switzerland.
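For context, the variational lower bound referred to above takes the standard form below, where x denotes the observed variables, h the latent variables, p_θ the DSBN generative model, and q_φ the inference model (here the cDBM); this notation is generic and not taken from the paper itself:

\log p_\theta(x) \;\geq\; \mathbb{E}_{q_\phi(h \mid x)}\big[\log p_\theta(x, h)\big] \;-\; \mathbb{E}_{q_\phi(h \mid x)}\big[\log q_\phi(h \mid x)\big] \;=\; \mathcal{L}(\theta, \phi)

VEM alternates between tightening this bound with respect to φ (fitting the inference model) and increasing it with respect to θ (updating the generative model). For an undirected q_φ such as the cDBM, the second expectation (the negative entropy of q_φ) is where the intractable partition function would ordinarily enter, which is presumably why avoiding its computation is the key property claimed in the abstract.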

Cite

CITATION STYLE

APA

Zhang, X., & Lyu, S. (2014). Variational EM learning of DSBNs with conditional deep Boltzmann machines. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8681 LNCS, pp. 257–264). Springer Verlag. https://doi.org/10.1007/978-3-319-11179-7_33
