Deep Boltzmann machines using adaptive temperatures


Abstract

Deep learning has recently become a hallmark technique across a number of applications. Among these techniques, models based on Restricted Boltzmann Machines have attracted considerable attention, since they are energy-driven models with latent variables that aim to learn the probability distribution of the input data. In a nutshell, training such models amounts to minimizing the energy of each training sample in order to increase its probability. This optimization process therefore needs to be regularized to reach the best trade-off between exploitation and exploration. In this work, we propose an adaptive, temperature-based regularization approach and demonstrate its advantages for Deep Belief Networks (DBNs) and Deep Boltzmann Machines (DBMs). The proposed approach is evaluated in the context of binary image reconstruction, where it outperforms fixed-temperature DBNs and DBMs.
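For context, the abstract does not spell out how temperature enters the model, but in standard Boltzmann-machine formulations a temperature T scales the Gibbs distribution of the energy. A minimal sketch, assuming a binary RBM with visible units v, hidden units h, biases a and b, and weights W (the adaptive schedule for T proposed in the paper is not reproduced here):

E(v, h) = -\sum_i a_i v_i - \sum_j b_j h_j - \sum_{i,j} v_i W_{ij} h_j

P(v, h) = \frac{1}{Z(T)} \exp\!\left(-\frac{E(v, h)}{T}\right), \qquad Z(T) = \sum_{v, h} \exp\!\left(-\frac{E(v, h)}{T}\right)

P(h_j = 1 \mid v) = \sigma\!\left(\frac{b_j + \sum_i W_{ij} v_i}{T}\right)

Under this formulation, a higher T flattens the distribution and encourages exploration during Gibbs sampling, while a lower T sharpens it and favors exploitation, which is the trade-off the adaptive scheme is meant to balance.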

Citation (APA)

Passos Júnior, L. A., Costa, K. A. P., & Papa, J. P. (2017). Deep Boltzmann machines using adaptive temperatures. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10424 LNCS, pp. 172–183). Springer Verlag. https://doi.org/10.1007/978-3-319-64689-3_14
