Abstract
One-dimensional versions of the Markov chain and the hidden Markov model have been generalized as Gaussian processes. Currently, these approaches support only a single dimension, which limits their usability. In this paper we encode the more general dynamic Gaussian Bayesian network as a Gaussian process, thereby allowing an arbitrary number of dimensions and arbitrary connections between time steps. The resulting Gaussian process formalism has three advantages: it supports direct inference from any time point to any other without propagating evidence through the whole network, its covariance function can be combined with other covariance functions if needed, and it preserves all properties of the dynamic Gaussian Bayesian network.
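To make the abstract's claim of "direct inference from any time point to any other" concrete, the following minimal Python sketch illustrates the general idea in the one-dimensional special case only: the Ornstein-Uhlenbeck covariance function corresponds to a scalar Gauss-Markov process, and standard Gaussian process regression with it queries an arbitrary time point directly, without stepwise evidence propagation. This is an illustrative assumption for exposition, not the paper's multi-dimensional dynamic Gaussian Bayesian network covariance construction; the function and parameter names (ou_kernel, s2, ell) are hypothetical.

import numpy as np

# Illustrative sketch (not the paper's construction): the Ornstein-Uhlenbeck
# kernel k(t, t') = s2 * exp(-|t - t'| / ell) is the covariance function of a
# one-dimensional Gauss-Markov process, i.e. the GP view of a scalar Markov chain.
def ou_kernel(t1, t2, s2=1.0, ell=2.0):
    return s2 * np.exp(-np.abs(t1[:, None] - t2[None, :]) / ell)

t_obs = np.array([0.0, 1.0, 2.0, 5.0])   # observed time steps
y_obs = np.array([0.3, 0.5, 0.1, -0.4])  # observed values
noise = 1e-2                             # observation noise variance

# Direct inference from the observed time points to an arbitrary query time,
# without propagating evidence through the intermediate time steps.
t_query = np.array([3.7])
K = ou_kernel(t_obs, t_obs) + noise * np.eye(len(t_obs))
k_star = ou_kernel(t_query, t_obs)

mean = k_star @ np.linalg.solve(K, y_obs)
var = ou_kernel(t_query, t_query) - k_star @ np.linalg.solve(K, k_star.T)
print(mean, var)

The paper's contribution, as stated in the abstract, is to extend this kind of kernel-based temporal inference from the one-dimensional case to dynamic Gaussian Bayesian networks with an arbitrary number of dimensions and arbitrary inter-time-step connections.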
Citation
Hartwig, M., & Möller, R. (2020). How to Encode Dynamic Gaussian Bayesian Networks as Gaussian Processes? In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12576 LNAI, pp. 371–382). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-64984-5_29