Restricted Boltzmann Machines are generative models that can be used as standalone feature extractors or as a parameter initialization for deeper models. Typically, these models are trained with the Contrastive Divergence algorithm, an approximate stochastic gradient descent method. In this paper, we aim to speed up the convergence of the learning procedure by applying the momentum method and Nesterov's accelerated gradient technique. We evaluate both techniques empirically on the MNIST image dataset. © Springer International Publishing Switzerland 2015.
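The sketch below is not the authors' code; it only illustrates how the techniques named in the abstract are commonly combined: a CD-1 gradient estimate for a binary RBM, followed by a weight update using either classical momentum or a standard parameter-space reformulation of Nesterov's accelerated gradient. All names (`cd1_gradient`, `update`), the hyperparameters, and the specific Nesterov variant are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_gradient(W, b, c, v0, rng):
    """One CD-1 gradient estimate on a mini-batch v0 of binary visible vectors."""
    # Positive phase: hidden probabilities given the data.
    h0 = sigmoid(v0 @ W + c)
    # One Gibbs step: sample hiddens, reconstruct visibles, recompute hiddens.
    h0_sample = (rng.random(h0.shape) < h0).astype(v0.dtype)
    v1 = sigmoid(h0_sample @ W.T + b)
    h1 = sigmoid(v1 @ W + c)
    n = v0.shape[0]
    dW = (v0.T @ h0 - v1.T @ h1) / n   # positive minus negative statistics
    db = (v0 - v1).mean(axis=0)
    dc = (h0 - h1).mean(axis=0)
    return dW, db, dc

def update(param, velocity, grad, lr=0.05, mu=0.9, nesterov=False):
    """Gradient ascent step with classical momentum or a Nesterov-style correction."""
    velocity = mu * velocity + grad
    # Nesterov variant: take the step along grad + mu * velocity ("look-ahead").
    step = grad + mu * velocity if nesterov else velocity
    return param + lr * step, velocity

# Illustrative usage on random data (shapes and values are placeholders).
rng = np.random.default_rng(0)
W = 0.01 * rng.standard_normal((784, 500))
b, c = np.zeros(784), np.zeros(500)
vW = np.zeros_like(W)
batch = (rng.random((64, 784)) < 0.5).astype(float)
dW, db, dc = cd1_gradient(W, b, c, batch, rng)
W, vW = update(W, vW, dW, nesterov=True)
```

The momentum coefficient mu and the learning rate lr above are arbitrary; the paper studies how such choices affect convergence, so they would be tuned per experiment.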
CITATION STYLE
Zareba, S., Gonczarek, A., Tomczak, J. M., & Świątek, J. (2015). Accelerated learning for Restricted Boltzmann Machine with momentum term. In Advances in Intelligent Systems and Computing (Vol. 1089, pp. 187–192). Springer Verlag. https://doi.org/10.1007/978-3-319-08422-0_28