Stochastic variational hierarchical mixture of sparse Gaussian processes for regression

Abstract

In this article, we propose a scalable Gaussian process (GP) regression method that combines the advantages of both global and local GP approximations through a two-layer hierarchical model using a variational inference framework. The upper layer consists of a global sparse GP to coarsely model the entire data set, whereas the lower layer comprises a mixture of sparse GP experts which exploit local information to learn a fine-grained model. A two-step variational inference algorithm is developed to learn the global GP, the GP experts and the gating network simultaneously. Stochastic optimization can be employed to allow the application of the model to large-scale problems. Experiments on a wide range of benchmark data sets demonstrate the flexibility, scalability and predictive power of the proposed method.
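To make the two-layer idea concrete, below is a minimal NumPy sketch, not the authors' variational algorithm: a global sparse GP (a subset-of-regressors predictor with inducing inputs) provides the coarse fit, and local sparse GP experts model its residuals, blended by a fixed distance-based softmax gate. The residual decomposition, the hand-set gate, and all hyperparameters are illustrative assumptions; the paper instead learns the global GP, the experts and the gating network jointly via two-step stochastic variational inference.

```python
# Illustrative sketch of a coarse global sparse GP plus local sparse GP
# experts on the residuals, mixed by a softmax gate. Hyperparameters,
# the residual split and the gate are assumptions, not the paper's method.
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def sparse_gp_predict(X, y, Z, Xs, noise=0.1, ls=1.0):
    """Subset-of-regressors predictive mean with inducing inputs Z."""
    Kmm = rbf(Z, Z, ls) + 1e-6 * np.eye(len(Z))
    Kmn = rbf(Z, X, ls)
    Ksm = rbf(Xs, Z, ls)
    Sigma = Kmm + Kmn @ Kmn.T / noise ** 2
    return Ksm @ np.linalg.solve(Sigma, Kmn @ y) / noise ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (500, 1))
y = np.sin(3 * X[:, 0]) + 0.3 * np.sin(15 * X[:, 0]) \
    + 0.05 * rng.standard_normal(500)

# Upper layer: global sparse GP with a long lengthscale -> coarse fit.
Zg = np.linspace(-3, 3, 10)[:, None]
Xs = np.linspace(-3, 3, 200)[:, None]
f_global = sparse_gp_predict(X, y, Zg, Xs, ls=1.0)
resid = y - sparse_gp_predict(X, y, Zg, X, ls=1.0)

# Lower layer: K local experts on the residuals, each with its own
# inducing points; a softmax gate weights them by proximity.
K = 4
centers = np.linspace(-2.25, 2.25, K)[:, None]
gate = np.exp(-2.0 * (Xs - centers.T) ** 2)       # shape (200, K)
gate /= gate.sum(1, keepdims=True)

f_local = np.zeros((len(Xs), K))
for k in range(K):
    mask = np.abs(X[:, 0] - centers[k, 0]) < 1.5  # expert's local data
    Zk = centers[k] + np.linspace(-1, 1, 5)[:, None]
    f_local[:, k] = sparse_gp_predict(X[mask], resid[mask], Zk, Xs, ls=0.3)

f = f_global + (gate * f_local).sum(1)            # combined prediction
target = np.sin(3 * Xs[:, 0]) + 0.3 * np.sin(15 * Xs[:, 0])
print("RMSE vs noise-free target:", np.sqrt(np.mean((f - target) ** 2)))
```

In this simplified setting the experts can use short lengthscales because the global layer has already absorbed the smooth trend, mirroring the coarse-to-fine division of labor between the two layers described in the abstract.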

Citation (APA)

Nguyen, T. N. A., Bouzerdoum, A., & Phung, S. L. (2018). Stochastic variational hierarchical mixture of sparse Gaussian processes for regression. Machine Learning, 107(12), 1947–1986. https://doi.org/10.1007/s10994-018-5721-5
