Simulation-based inference enables learning the parameters of a model even when its likelihood cannot be computed in practice. One class of methods uses data simulated with different parameters to infer models of the likelihood-to-evidence ratio, or equivalently the posterior function. Here we frame the inference task as the estimation of an energy function parametrized by an artificial neural network. We present an intuitive approach, named MINIMALIST, in which the optimal model of the likelihood-to-evidence ratio is found by maximizing the likelihood of simulated data. Within this framework, the connection between the task of simulation-based inference and mutual-information maximization is made explicit, and we show how several known methods of posterior estimation relate to alternative lower bounds on mutual information. These distinct objective functions aim at the same optimal energy form and can therefore be directly benchmarked. We compare their accuracy in the inference of model parameters, focusing on four dynamical systems that encompass common challenges in time-series analysis: dynamics driven by multiplicative noise, nonlinear interactions, chaotic behavior, and high-dimensional parameter spaces.
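To make the core idea concrete, the sketch below illustrates likelihood-to-evidence ratio estimation on a toy problem. Everything here is an assumption for illustration, not the paper's setup: the simulator is a simple Gaussian (`x ~ N(theta, 1)`), and a logistic regression on hand-crafted quadratic features stands in for the neural-network energy function. The key property is shared with the methods in the abstract: a classifier trained to distinguish joint pairs `(theta, x)` from marginal (shuffled) pairs has an optimal logit equal to the log likelihood-to-evidence ratio `log p(x|theta)/p(x)`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulator (illustrative assumption, not the paper's systems): x ~ N(theta, 1)
def simulate(theta):
    return theta + rng.standard_normal(theta.shape)

n = 4000
theta = rng.uniform(-3, 3, n)   # parameters drawn from a flat prior
x = simulate(theta)             # joint pairs (theta, x) ~ p(theta, x)
x_marg = rng.permutation(x)     # shuffled pairs ~ p(theta) p(x)

def features(t, xx):
    # Quadratic features stand in for a neural-network energy function.
    return np.stack([np.ones_like(t), t, xx, t * xx, t**2, xx**2], axis=1)

# Logistic regression: classify joint (y=1) vs marginal (y=0) pairs.
# At the optimum, the logit F @ w approximates log p(x|theta) / p(x).
F = np.vstack([features(theta, x), features(theta, x_marg)])
y = np.concatenate([np.ones(n), np.zeros(n)])
w = np.zeros(F.shape[1])
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-F @ w))       # predicted probability of "joint"
    w -= 0.05 * F.T @ (p - y) / len(y)     # gradient step on the cross-entropy

def log_r(t, xx):
    """Learned approximation of the log likelihood-to-evidence ratio."""
    return features(t, xx) @ w

# Matched pairs should score higher on average than shuffled ones.
print(np.mean(log_r(theta, x)), np.mean(log_r(theta, x_marg)))
```

Amortization enters through `log_r` itself: once trained, it can be evaluated for any `(theta, x)` pair without further simulation, so the same fit serves the posterior `p(theta|x) ∝ p(theta) exp(log_r)` for every observation.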
Isacchini, G., Spisak, N., Nourmohammad, A., Mora, T., & Walczak, A. M. (2022). Mutual information maximization for amortized likelihood inference from sampled trajectories: MINIMALIST. Physical Review E, 105(5). https://doi.org/10.1103/PhysRevE.105.055309