Differentiable algorithm for marginalising changepoints

Abstract

We present an algorithm for marginalising changepoints in time-series models that assume a fixed number of unknown changepoints. Our algorithm is differentiable with respect to its inputs, which are the values of latent random variables other than changepoints. Also, it runs in time O(mn), where n is the number of time steps and m the number of changepoints, an improvement over a naive marginalisation method with O(n^m) time complexity. We derive the algorithm by identifying quantities related to this marginalisation problem, showing that these quantities satisfy recursive relationships, and transforming the relationships into an algorithm via dynamic programming. Since our algorithm is differentiable, it can be used to convert a model that is non-differentiable due to changepoints into a differentiable one, so that the resulting model can be analysed using gradient-based inference or learning techniques. We empirically demonstrate the effectiveness of our algorithm in this application by tackling the posterior inference problem on synthetic and real-world data.
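
To illustrate the kind of computation the abstract describes, the following is a minimal JAX sketch of a dynamic-programming marginalisation over a fixed number of changepoints. It is not the paper's exact recursion: it assumes a likelihood that factorises over time steps given the segment index, places uniform weight on every valid changepoint placement (the prior's normalising constant is omitted), and the names `log_marginal_likelihood` and `emission_loglik` are hypothetical. Under these assumptions the recursion runs in O(mn) time and, being built from smooth operations, stays differentiable with respect to the per-segment parameters.

```python
import jax
import jax.numpy as jnp


def log_marginal_likelihood(emission_loglik):
    """Marginalise m changepoints by dynamic programming in O(mn) time.

    emission_loglik: array of shape (m + 1, n); entry [j, t] is the
    log-likelihood of observation x_t under the parameters of segment j.
    Returns the log of the sum, over all placements 0 < t_1 < ... < t_m < n,
    of the product of per-step likelihoods (uniform weight per placement;
    the prior's normalising constant is left out for brevity).
    """
    num_segments, _ = emission_loglik.shape
    neg_inf = -jnp.inf
    # alpha[j] = log-mass of all partial placements in which the current
    # time step belongs to segment j; only segment 0 is active at t = 0.
    alpha = jnp.full((num_segments,), neg_inf).at[0].set(emission_loglik[0, 0])

    def step(alpha, ell_t):
        stay = alpha                                                   # no changepoint at t
        switch = jnp.concatenate([jnp.array([neg_inf]), alpha[:-1]])   # changepoint at t
        new_alpha = jnp.logaddexp(stay, switch) + ell_t
        return new_alpha, None

    alpha, _ = jax.lax.scan(step, alpha, emission_loglik[:, 1:].T)
    # All m changepoints must have occurred by the last time step.
    return alpha[-1]
```

Because the marginalisation is expressed with logaddexp and a scan, jax.grad can propagate gradients through it into whatever parameters produced emission_loglik, which is the property the abstract exploits for gradient-based inference and learning.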

Cite

APA

Lim, H., Che, G., Lee, W., & Yang, H. (2020). Differentiable algorithm for marginalising changepoints. In AAAI 2020 - 34th AAAI Conference on Artificial Intelligence (pp. 4828–4835). AAAI Press. https://doi.org/10.1609/aaai.v34i04.5918
