Gaussian process regression networks (GPRNs) are powerful Bayesian models for multi-output regression, but exact posterior inference in them is intractable. To address this issue, existing methods approximate the posterior with a fully factorized structure (or a mixture of such structures) over all the outputs and latent functions, which can miss strong posterior dependencies among the latent variables and hurt the inference quality. In addition, the variational-parameter updates are inefficient and can become prohibitively expensive when the number of outputs is large. To overcome these limitations, we propose a scalable variational inference algorithm for GPRNs that not only captures the rich posterior dependencies but is also far more efficient for a massive number of outputs. We tensorize the output space and introduce tensor/matrix-normal variational posteriors, which capture the posterior correlations while greatly reducing the number of variational parameters. We jointly optimize all the parameters and exploit the inherent Kronecker product structure in the variational evidence lower bound (ELBO) to accelerate the computation. We demonstrate the advantages of our method in several real-world applications.
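The paper itself gives the algorithmic details; as a rough illustration of the machinery the abstract refers to, the plain-NumPy sketch below (all variable names are hypothetical) shows why a matrix-normal variational posterior with Kronecker-structured covariance is cheap: sampling, matrix-vector products, and log-determinants all reduce to operations on the small row/column covariance factors rather than on the full (d1*d2) x (d1*d2) covariance. This is a generic demonstration of the standard Kronecker identities, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

d1, d2 = 4, 3  # row / column dimensions of the tensorized output grid

# Row and column covariance factors of a matrix-normal posterior
# q(W) = MN(M, U, V), equivalent to vec(W) ~ N(vec(M), V kron U).
A = rng.standard_normal((d1, d1)); U = A @ A.T + d1 * np.eye(d1)
B = rng.standard_normal((d2, d2)); V = B @ B.T + d2 * np.eye(d2)
M = rng.standard_normal((d1, d2))

# Sampling without ever forming the (d1*d2) x (d1*d2) covariance:
# W = M + L_U Z L_V^T with Z i.i.d. standard normal.
L_U, L_V = np.linalg.cholesky(U), np.linalg.cholesky(V)
Z = rng.standard_normal((d1, d2))
W = M + L_U @ Z @ L_V.T

# Kronecker identity that avoids the big matrix-vector product:
# (V kron U) vec(W) == vec(U W V^T)   (column-major vec).
w = W.reshape(-1, order="F")                # vec(W), column-major
lhs = np.kron(V, U) @ w                     # naive O((d1*d2)^2) route
rhs = (U @ W @ V.T).reshape(-1, order="F")  # O(d1*d2*(d1+d2)) route
assert np.allclose(lhs, rhs)

# The log-determinant also factorizes:
# log|V kron U| = d1 log|V| + d2 log|U|,
# which keeps ELBO terms cheap for large output grids.
big = np.linalg.slogdet(np.kron(V, U))[1]
small = d1 * np.linalg.slogdet(V)[1] + d2 * np.linalg.slogdet(U)[1]
assert np.isclose(big, small)

Note also the parameter saving: a full Gaussian posterior over the d1*d2 outputs needs on the order of (d1*d2)^2 covariance entries, while the matrix-normal factorization needs only d1^2 + d2^2, which is the reduction the abstract alludes to.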
CITATION STYLE
Li, S., Xing, W., Kirby, R. M., & Zhe, S. (2020). Scalable Gaussian process regression networks. In Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI) (pp. 2456–2462). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2020/340