A specialized probability density function for the input of mixture of Gaussian processes


Abstract

Mixture of Gaussian Processes (MGP) is a powerful generative model widely used in machine learning and data mining. However, when learning this model on a given dataset, the probability density function (pdf) of the input must be set in advance, and it is generally taken to be a Gaussian distribution. For some real-world data such as time series, however, this assumption is neither reasonable nor effective. In this paper, we propose a specialized pdf for the input of the MGP model: a piecewise-defined continuous function with three parts, in which the middle part takes the form of a uniform distribution while the two side parts take the form of a Gaussian distribution. This specialized pdf is more consistent with a uniform distribution of the input than a Gaussian pdf, and its two Gaussian-shaped tails ensure the effectiveness of the iterations of the hard-cut EM algorithm for MGPs. Experiments on simulated and stock datasets demonstrate that the MGP model with these specialized pdfs yields better results on time series prediction than general MGP models as well as other classical regression methods.
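The piecewise pdf described above can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: it assumes a uniform segment on a hypothetical interval [a, b] joined continuously to two Gaussian half-tails with a shared scale parameter sigma, with the normalizing constant chosen so the function integrates to one.

```python
import math

def piecewise_pdf(x, a, b, sigma):
    """Sketch of a three-part pdf: uniform on [a, b], Gaussian tails outside.

    The interval [a, b] and tail scale sigma are hypothetical parameters,
    chosen here only to illustrate the construction from the abstract.
    """
    # Normalizing constant: the uniform segment contributes c * (b - a),
    # and the two Gaussian half-tails together contribute the area of one
    # full Gaussian, c * sigma * sqrt(2 * pi).
    c = 1.0 / ((b - a) + sigma * math.sqrt(2.0 * math.pi))
    if x < a:
        # Left Gaussian tail, continuous with the uniform part at x = a.
        return c * math.exp(-((x - a) ** 2) / (2.0 * sigma ** 2))
    elif x <= b:
        # Middle part: uniform density over [a, b].
        return c
    else:
        # Right Gaussian tail, continuous with the uniform part at x = b.
        return c * math.exp(-((x - b) ** 2) / (2.0 * sigma ** 2))
```

Unlike a plain uniform density, this function is strictly positive everywhere, so every input point retains a nonzero likelihood, which is what keeps the hard-cut EM iterations well defined.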

Citation (APA)

Zhao, L., & Ma, J. (2018). A specialized probability density function for the input of mixture of Gaussian processes. In IFIP Advances in Information and Communication Technology (Vol. 539, pp. 70–80). Springer New York LLC. https://doi.org/10.1007/978-3-030-01313-4_8
