We propose a tensor-based approach to infer causal structures from time series. An information-theoretic analysis of transfer entropy (TE) shows that TE results from the transmission of information over a set of communication channels. Tensors are the mathematical equivalent of these multichannel causal channels. The total effect of successive transmissions, i.e., the total effect of a cascade, can then be expressed in terms of the tensors of the individual transmissions using tensor multiplication. With this formalism, differences in underlying structures can be detected that are undetectable using TE or mutual information alone. Additionally, for a system comprising three variables, we prove that bivariate analysis suffices to infer the structure, that is, to differentiate between direct and indirect associations. Some results translate to TE itself; for example, a Data Processing Inequality (DPI) is proven to hold for transfer entropy.
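To make the quantities concrete, the display below states the standard definition of transfer entropy (with history lengths k and l) together with a simplified, memoryless illustration of how channels compose under multiplication. The paper's tensor notation, conditioning sets, and the exact form of its DPI may differ; this is a sketch under those assumptions, not the paper's formalism.

\[
T_{X \to Y} \;=\; I\bigl(Y_{t+1};\, X_t^{(k)} \,\bigm|\, Y_t^{(l)}\bigr)
\;=\; \sum p\bigl(y_{t+1}, y_t^{(l)}, x_t^{(k)}\bigr)\,
\log \frac{p\bigl(y_{t+1} \mid y_t^{(l)}, x_t^{(k)}\bigr)}{p\bigl(y_{t+1} \mid y_t^{(l)}\bigr)}
\]

For a memoryless cascade $X \to Y \to Z$ with channel (stochastic) matrices $A_{xy} = p(y \mid x)$ and $B_{yz} = p(z \mid y)$, the end-to-end channel is the product
\[
(AB)_{xz} \;=\; \sum_{y} p(y \mid x)\, p(z \mid y) \;=\; p(z \mid x),
\]
and the classical data processing inequality reads $I(X;Z) \le \min\{ I(X;Y),\, I(Y;Z) \}$; the DPI proven in the paper plays the analogous role for transfer entropy.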