The inference of network representations that capture causal relations in time series is a challenging problem. In this paper, we explore the use of information-theoretic tools for characterising information flow between time series and for inferring networks that represent time series data. We explore two different approaches. The first uses transfer entropy to characterise information flow and measures network similarity using the Jensen-Shannon divergence. The second uses time series correlation and the Kullback-Leibler divergence to compare the distribution of correlations across edges for different networks. We examine how both weighted and unweighted representations derived from these two characterisations perform on real-world time series data. Experiments on time series data for the New York Stock Exchange show that transfer entropy gives better localisation of temporal anomalies in graph time series. Moreover, it leads to embeddings of network time series that better preserve their temporal order.
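To make the quantities named above concrete, the following sketch shows a simple plug-in (histogram-based) estimator of transfer entropy for discretised time series, together with the Jensen-Shannon divergence between two discrete distributions. This is a minimal illustration of the general definitions only, not the authors' implementation; the function names and the plug-in estimation scheme are our own assumptions.

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate (in bits) of transfer entropy from x to y,
    TE(X->Y) = sum p(y_{t+1}, y_t, x_t) log[ p(y_{t+1}|y_t, x_t) / p(y_{t+1}|y_t) ],
    for two equal-length sequences of discrete symbols, using history length 1."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    n = len(triples)
    p_xyz = Counter(triples)                      # counts of (y_{t+1}, y_t, x_t)
    p_yz = Counter((yn, yp) for yn, yp, _ in triples)   # counts of (y_{t+1}, y_t)
    p_zx = Counter((yp, xp) for _, yp, xp in triples)   # counts of (y_t, x_t)
    p_z = Counter(yp for _, yp, _ in triples)           # counts of y_t
    te = 0.0
    for (yn, yp, xp), c in p_xyz.items():
        cond_full = c / p_zx[(yp, xp)]            # p(y_{t+1} | y_t, x_t)
        cond_marg = p_yz[(yn, yp)] / p_z[yp]      # p(y_{t+1} | y_t)
        te += (c / n) * log2(cond_full / cond_marg)
    return te

def js_divergence(p, q):
    """Jensen-Shannon divergence (in bits) between two discrete
    distributions given as aligned probability vectors."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]   # mixture distribution
    def kl(a, b):                                 # Kullback-Leibler divergence
        return sum(ai * log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```

For example, if `y` simply copies `x` with a one-step lag, `transfer_entropy(x, y)` is large (information flows from X to Y), while `js_divergence` is zero for identical distributions and one bit for distributions with disjoint support.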
Caglar, I., & Hancock, E. R. (2019). Network Time Series Analysis Using Transfer Entropy. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11510 LNCS, pp. 194–203). Springer Verlag. https://doi.org/10.1007/978-3-030-20081-7_19