We present a novel Metropolis-Hastings method for large datasets that uses small expected-size minibatches of data. Previous work on reducing the cost of Metropolis-Hastings tests yields only constant-factor reductions over using the full dataset for each sample. In contrast, our method can be tuned to provide arbitrarily small batch sizes by adjusting either the proposal step size or the temperature. Our test uses the noise-tolerant Barker acceptance test with a novel additive correction variable. The resulting test has a cost comparable to a standard SGD update. Our experiments demonstrate speedups of several orders of magnitude over previous work.
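The abstract only sketches the test, so a minimal illustration of the general idea follows. It assumes a minibatch estimate `delta_hat` of the log acceptance ratio whose noise is approximately Gaussian with estimated standard deviation `sigma_hat`; a Barker-style test then accepts when the noisy estimate plus an additive correction exceeds zero. The function names and the Gaussian moment-matching correction below are illustrative placeholders, not the paper's actual correction construction (which is built numerically); this is a sketch under those assumptions, not the authors' implementation.

```python
import numpy as np


def gaussian_moment_match_correction(sigma_hat, rng):
    """Crude stand-in for the additive correction variable.

    Matches only the variance of the standard logistic (pi^2 / 3) with a
    zero-mean Gaussian; the paper constructs the correction distribution
    far more carefully. Requires sigma_hat^2 < pi^2 / 3.
    """
    var_c = np.pi ** 2 / 3.0 - sigma_hat ** 2
    if var_c <= 0:
        raise ValueError("minibatch noise too large for this crude correction")
    return rng.normal(0.0, np.sqrt(var_c))


def barker_minibatch_accept(delta_hat, sigma_hat, rng=None):
    """Barker-style acceptance decision from a noisy minibatch estimate.

    delta_hat : minibatch estimate of the log acceptance ratio,
                assumed ~ N(delta, sigma_hat^2) by a CLT argument.
    sigma_hat : estimated standard deviation of delta_hat.
    Returns True if the proposed sample should be accepted.
    """
    rng = rng or np.random.default_rng()
    # Exact Barker test: accept iff delta + L > 0 with L ~ Logistic(0, 1).
    # delta_hat already carries ~N(0, sigma_hat^2) noise, so we add only a
    # correction C chosen so the total noise is (approximately) logistic.
    c = gaussian_moment_match_correction(sigma_hat, rng)
    return delta_hat + c > 0
```

In a full sampler, `delta_hat` would be computed from a random minibatch of per-datum log-likelihood ratio terms (rescaled to the full dataset) plus the prior and proposal contributions, with the minibatch grown whenever `sigma_hat` is too large for the correction to apply.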
CITATION STYLE
Seita, D., Pan, X., Chen, H., & Canny, J. (2018). An efficient minibatch acceptance test for metropolis-hastings. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2018-July, pp. 5359–5363). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2018/753