Abstract
In this paper, we propose a novel stochastic fractional Hamiltonian Monte Carlo approach that generalizes Hamiltonian Monte Carlo within the framework of fractional calculus and Lévy diffusion. Owing to the large "jumps" introduced by the Lévy noise and the momentum term, the proposed dynamics can explore the parameter space more efficiently and effectively. We show that fractional Hamiltonian Monte Carlo samples multi-modal and high-dimensional target distributions more efficiently than existing methods driven by Brownian diffusion. We further extend our method to the optimization of deep neural networks. Experimental results show that, when training deep neural networks, the proposed stochastic fractional Hamiltonian Monte Carlo converges faster than other popular optimization schemes and generalizes better.
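To make the idea concrete, the following is a minimal, illustrative sketch, not the authors' exact dynamics: an SGHMC-style update in which the usual Gaussian momentum noise is replaced by symmetric α-stable (Lévy) noise, so the chain can take occasional large jumps between modes. The toy bimodal target, step size, friction term, and all function names here are assumptions chosen only for illustration.

```python
# Illustrative sketch only: Levy-noise-driven momentum updates on a toy
# bimodal target. Not the paper's exact fractional dynamics.
import numpy as np
from scipy.stats import levy_stable


def neg_log_prob_grad(theta):
    """Gradient of -log p(theta) for a toy target: an equal-weight
    mixture of two unit-variance Gaussians centered at -3 and +3."""
    centers = np.array([-3.0, 3.0])
    diffs = theta - centers                 # distance to each mode
    weights = np.exp(-0.5 * diffs ** 2)
    weights /= weights.sum()                # mode responsibilities
    return np.sum(weights * diffs)          # d/dtheta of -log p(theta)


def fractional_hmc_sketch(n_samples=5000, alpha=1.7, eps=0.05,
                          friction=1.0, seed=0):
    """alpha in (1, 2] controls tail heaviness of the momentum noise;
    alpha = 2 recovers Gaussian noise, i.e. an ordinary SGHMC-style chain."""
    rng = np.random.default_rng(seed)
    theta, momentum = 0.0, 0.0
    samples = []
    for _ in range(n_samples):
        # Symmetric alpha-stable increment (beta = 0) injected into momentum:
        # heavy tails occasionally produce large jumps across modes.
        noise = levy_stable.rvs(alpha, 0.0, scale=eps, random_state=rng)
        grad = neg_log_prob_grad(theta)
        momentum += -eps * grad - eps * friction * momentum + noise
        theta += eps * momentum
        samples.append(theta)
    return np.array(samples)


if __name__ == "__main__":
    draws = fractional_hmc_sketch()
    print("sample mean:", draws.mean())
    print("fraction of draws in the right-hand mode:", (draws > 0).mean())
```

With the heavy-tailed noise (alpha below 2), the chain tends to visit both modes of the toy target, whereas a purely Gaussian-driven chain with the same small step size can remain trapped near its starting mode for long stretches.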
Ye, N., & Zhu, Z. (2018). Stochastic fractional Hamiltonian Monte Carlo. In Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI 2018) (pp. 3019–3025). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2018/419