In hierarchical learning machines such as neural networks, Bayesian learning provides better generalization performance than maximum likelihood estimation. However, accurately approximating it with the Markov chain Monte Carlo (MCMC) method requires enormous computational cost. The exchange Monte Carlo (EMC) method was proposed as an improvement on the MCMC method. Although its effectiveness has been demonstrated not only in Bayesian learning but also in many other fields, the mathematical foundation of the EMC method has not yet been established. In this paper, we clarify the asymptotic behavior of the symmetrized Kullback divergence and the average exchange ratio, which are used as criteria for designing the EMC method. © Springer-Verlag Berlin Heidelberg 2007.
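As background to the abstract above, the EMC method (also known as replica-exchange or parallel tempering) runs one Metropolis chain per inverse temperature and periodically proposes to swap the states of neighboring chains, accepting a swap with probability min(1, exp((β_k − β_{k+1})(E_k − E_{k+1}))). The sketch below is illustrative only, not the paper's algorithm or analysis: the double-well `energy` function, the temperature ladder, and all function names are assumptions chosen for demonstration. It tracks the empirical average exchange ratio per neighboring temperature pair, the quantity the paper studies as a design criterion.

```python
import math
import random

def energy(x):
    # Hypothetical double-well potential used only for illustration.
    return (x**2 - 1.0)**2

def metropolis_step(x, beta, rng, step=0.5):
    # One standard Metropolis update at inverse temperature beta.
    x_new = x + rng.uniform(-step, step)
    delta = energy(x_new) - energy(x)
    if delta <= 0 or rng.random() < math.exp(-beta * delta):
        return x_new
    return x

def exchange_mc(betas, n_sweeps=2000, seed=0):
    # Exchange Monte Carlo: one replica per inverse temperature,
    # with periodic swap proposals between adjacent replicas.
    rng = random.Random(seed)
    xs = [0.0] * len(betas)
    accepted = [0] * (len(betas) - 1)
    proposed = [0] * (len(betas) - 1)
    for _ in range(n_sweeps):
        for i, b in enumerate(betas):
            xs[i] = metropolis_step(xs[i], b, rng)
        # Propose an exchange between a random adjacent pair (k, k+1).
        k = rng.randrange(len(betas) - 1)
        proposed[k] += 1
        d = (betas[k] - betas[k + 1]) * (energy(xs[k]) - energy(xs[k + 1]))
        if d >= 0 or rng.random() < math.exp(d):
            xs[k], xs[k + 1] = xs[k + 1], xs[k]
            accepted[k] += 1
    # Empirical average exchange ratio for each neighboring pair.
    return [a / p if p else 0.0 for a, p in zip(accepted, proposed)]

ratios = exchange_mc([0.2, 0.5, 1.0, 2.0])
```

In practice, the temperature schedule is tuned so these per-pair exchange ratios stay away from zero; the paper's contribution is an asymptotic characterization of this quantity.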
CITATION STYLE
Nagata, K., & Watanabe, S. (2007). Algebraic geometric study of exchange Monte Carlo method. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4668 LNCS, pp. 687–696). Springer Verlag. https://doi.org/10.1007/978-3-540-74690-4_70