Properties and Bayesian fitting of restricted Boltzmann machines

Citations: 2 · Mendeley readers: 17

Abstract

A restricted Boltzmann machine (RBM) is an undirected graphical model for discrete or continuous random variables, with two layers, one hidden and one visible, and no conditional dependency within a layer. In recent years, RBMs have risen to prominence due to their connection to deep learning: by treating the hidden layer of one RBM as the visible layer of a second RBM, a deep architecture can be created. RBMs are thereby thought to be able to encode very complex and rich structures in data, making them attractive for supervised learning. However, the generative behavior of RBMs is largely unexplored, and typical fitting methodology does not easily allow for uncertainty quantification in addition to point estimates. In this paper, we discuss the relationship between RBM parameter specification in the binary case and model properties such as degeneracy, instability, and uninterpretability. We also describe the associated difficulties that can arise with likelihood-based inference and further discuss the potential Bayesian fitting of such (highly flexible) models, especially as Gibbs sampling (quasi-Bayes) methods are often advocated for the RBM model structure.
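The bipartite structure described above (no conditional dependency within a layer) is what makes block Gibbs sampling natural for RBMs: given the visible layer, the hidden units are conditionally independent, and vice versa. The sketch below illustrates this for a small binary RBM; the weight and bias values are illustrative assumptions, not parameters from the paper.

```python
# Minimal sketch of a binary RBM with block Gibbs sampling.
# P(h_j = 1 | v) = sigmoid(c_j + sum_i v_i W_ij)
# P(v_i = 1 | h) = sigmoid(b_i + sum_j W_ij h_j)
# All numeric parameter values below are assumed for illustration.
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def p_hidden_given_visible(v, W, c):
    # Hidden units are conditionally independent given the visible layer
    return [sigmoid(c[j] + sum(v[i] * W[i][j] for i in range(len(v))))
            for j in range(len(c))]

def p_visible_given_hidden(h, W, b):
    # Visible units are conditionally independent given the hidden layer
    return [sigmoid(b[i] + sum(W[i][j] * h[j] for j in range(len(h))))
            for i in range(len(b))]

def gibbs_chain(v0, W, b, c, steps, rng):
    # Alternate block updates: sample h | v, then v | h
    v = list(v0)
    for _ in range(steps):
        h = [1 if rng.random() < p else 0
             for p in p_hidden_given_visible(v, W, c)]
        v = [1 if rng.random() < p else 0
             for p in p_visible_given_hidden(h, W, b)]
    return v

rng = random.Random(0)
W = [[0.5, -0.2], [0.1, 0.3], [-0.4, 0.2]]  # 3 visible x 2 hidden (assumed)
b = [0.0, 0.1, -0.1]                         # visible biases (assumed)
c = [0.2, -0.3]                              # hidden biases (assumed)
sample = gibbs_chain([1, 0, 1], W, b, c, steps=100, rng=rng)
print(sample)
```

Note that such alternating updates are the quasi-Bayes mechanism the abstract refers to; the paper's point is that the induced generative behavior (e.g., degeneracy and instability) depends sensitively on how the parameters are specified.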

Citation (APA)

Kaplan, A., Nordman, D., & Vardeman, S. (2019). Properties and Bayesian fitting of restricted Boltzmann machines. Statistical Analysis and Data Mining, 12(1), 23–38. https://doi.org/10.1002/sam.11396
