Recently, Lifted Marginal Filtering has been proposed, an efficient Bayesian filtering algorithm for stochastic systems consisting of multiple, (inter-)acting agents and objects (entities). The algorithm achieves its efficiency by performing inference jointly over groups of similar entities, i.e. entities whose properties follow the same distribution. In this paper, we explore the case where no entities are directly suitable for grouping, which is typical for many real-world scenarios. We devise a mechanism to identify entity groups by formulating the distribution described by the grouped representation as a mixture distribution, such that its parameters can be fitted by Expectation Maximization. Specifically, in this paper, we investigate the Gaussian mixture case. Furthermore, we show how Gaussian mixture merging methods can be used to prevent the number of groups from growing indefinitely over time. We evaluate our approach on an activity prediction task in an online multiplayer game. The results suggest that, compared to the conventional approach in which all entities are handled individually, the decrease in prediction accuracy is small, while inference runtime decreases significantly.
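The two ingredients named above — fitting a Gaussian mixture by Expectation Maximization, and merging mixture components to bound the number of groups — can be sketched in a few lines. The code below is an illustrative 1-D toy, not the paper's actual implementation: the quantile-based initialization and the moment-matching merge rule are common textbook choices, assumed here for concreteness.

```python
import numpy as np

def em_gmm(x, k, iters=100):
    """Fit a 1-D Gaussian mixture with EM (illustrative sketch).

    Returns component weights, means, and variances. Means are
    initialized at spread-out data quantiles (an assumption for
    this toy; the paper does not prescribe an initialization).
    """
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))   # spread initial means
    var = np.full(k, x.var())                        # shared broad variance
    w = np.full(k, 1.0 / k)                          # uniform weights
    for _ in range(iters):
        # E-step: responsibilities r[n, j] of component j for point n
        d = x[:, None] - mu[None, :]
        logp = -0.5 * (d**2 / var + np.log(2 * np.pi * var)) + np.log(w)
        logp -= logp.max(axis=1, keepdims=True)      # numerical stability
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu)**2).sum(axis=0) / nk
    return w, mu, var

def merge(w1, m1, v1, w2, m2, v2):
    """Moment-matching merge of two weighted Gaussian components:
    the merged component preserves the total weight, mean, and
    variance of the pair (the standard merge rule used to keep
    mixture size bounded)."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    v = (w1 * (v1 + (m1 - m)**2) + w2 * (v2 + (m2 - m)**2)) / w
    return w, m, v
```

In the lifted-filtering setting, each recovered mixture component plays the role of one entity group, and merging is applied whenever the number of components would otherwise grow over time.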
Citation:
Lüdtke, S., Molina, A., Kersting, K., & Kirste, T. (2019). Gaussian Lifted Marginal Filtering. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11793 LNAI, pp. 230–243). Springer Verlag. https://doi.org/10.1007/978-3-030-30179-8_19