The authors introduce a stochastic generative model for continuous data, with a simple and reliable training algorithm. The architecture is a continuous restricted Boltzmann machine, trained by minimising contrastive divergence with a single step of Gibbs sampling, which replaces a time-consuming relaxation search. With a small approximation, the training algorithm requires only addition and multiplication and is thus computationally inexpensive in both software and hardware. The capabilities of the model are demonstrated and explored with both artificial and real data.
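A minimal sketch of the training idea described above, assuming a simple NumPy formulation: continuous units pass a noisy weighted sum through a bounded sigmoid, and one Gibbs step supplies the "negative" statistics for a contrastive-divergence update built only from additions and multiplications (outer products). The class and parameter names here are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def phi(x, a, lo=-1.0, hi=1.0):
    # Bounded sigmoid with gain `a` and asymptotes lo/hi.
    return lo + (hi - lo) / (1.0 + np.exp(-a * x))

class CRBM:
    # Toy continuous restricted Boltzmann machine trained with
    # one-step contrastive divergence (CD-1). Hypothetical sketch.
    def __init__(self, n_vis, n_hid, sigma=0.2, lr=0.05):
        self.W = rng.normal(0.0, 0.1, size=(n_vis, n_hid))
        self.a_h = np.ones(n_hid)  # per-hidden-unit gain parameters
        self.sigma = sigma         # scale of the injected Gaussian noise
        self.lr = lr               # learning rate

    def sample_h(self, v):
        # Continuous stochastic hidden state: noisy input through sigmoid.
        pre = v @ self.W + self.sigma * rng.standard_normal(self.W.shape[1])
        return phi(pre, self.a_h)

    def sample_v(self, h):
        pre = h @ self.W.T + self.sigma * rng.standard_normal(self.W.shape[0])
        return phi(pre, 1.0)

    def cd1_step(self, v0):
        # One Gibbs step replaces a full relaxation search.
        h0 = self.sample_h(v0)
        v1 = self.sample_v(h0)
        h1 = self.sample_h(v1)
        # CD-1 weight update: difference of data and one-step correlations.
        self.W += self.lr * (np.outer(v0, h0) - np.outer(v1, h1))
        return v1
```

The update uses only element-wise products and sums, which is the property the authors exploit for cheap software and hardware implementation.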