How entropic regression beats the outliers problem in nonlinear system identification

45 citations · 35 Mendeley readers
Abstract

In this work, we develop a nonlinear System Identification (SID) method called Entropic Regression. The method adopts an information-theoretic measure for the data-driven discovery of the underlying dynamics; it is robust to noise and outliers and outperforms many current state-of-the-art methods. Moreover, Entropic Regression overcomes major limitations of existing methods, such as sloppy parameters, diverse scales, and SID in high-dimensional systems such as complex networks. The use of information-theoretic measures in Entropic Regression has a unique advantage: by the Asymptotic Equipartition Property of probability distributions, outliers and other low-occurrence events are conveniently and intrinsically de-emphasized as atypical, by definition. We provide a numerical comparison with current state-of-the-art sparse regression methods and apply the methods to chaotic systems including the Lorenz system, the Kuramoto-Sivashinsky equation, and the double-well potential.
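To make the idea concrete, here is a minimal sketch of an entropic-regression-style selection loop, not the authors' exact algorithm: candidate dictionary terms are ranked by an information-theoretic score (a histogram estimate of mutual information with the residual) rather than by a squared-error criterion, and the double-well system dx/dt = x - x^3 serves as the test dynamics. The dictionary of monomials, the bin count, and the two-term selection budget are all illustrative assumptions.

```python
# Hedged sketch (assumptions: monomial dictionary, histogram MI estimator,
# two-term budget) of information-theoretic term selection for SID.
import numpy as np

rng = np.random.default_rng(0)

def mutual_info(x, y, bins=16):
    """Histogram estimate of the mutual information I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Synthetic data from the double-well gradient flow dx/dt = x - x^3,
# with a few heavy-tailed (Cauchy) outliers mixed into the derivatives.
x = rng.uniform(-1.5, 1.5, 2000)
dx = x - x**3 + 0.01 * rng.normal(size=x.size)
dx[rng.choice(x.size, 20, replace=False)] += rng.standard_cauchy(20)

# Candidate dictionary: monomials x^0 .. x^5.
dictionary = {f"x^{k}": x**k for k in range(6)}

# Greedy forward selection: at each step, pick the unused term with the
# largest mutual information with the current residual, then refit by
# least squares on the selected terms and update the residual.
selected, residual = [], dx.copy()
for _ in range(2):
    scores = {name: mutual_info(f, residual)
              for name, f in dictionary.items() if name not in selected}
    selected.append(max(scores, key=scores.get))
    A = np.column_stack([dictionary[n] for n in selected])
    coef, *_ = np.linalg.lstsq(A, dx, rcond=None)
    residual = dx - A @ coef

print(selected)  # the two dictionary terms chosen by the MI criterion
```

Because the score is an expectation over the empirical distribution, the handful of Cauchy-corrupted samples carries little weight in the histogram estimate, which is the intuition behind the AEP-based outlier robustness described in the abstract.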

Citation (APA)

Almomani, A. A. R., Sun, J., & Bollt, E. (2020). How entropic regression beats the outliers problem in nonlinear system identification. Chaos, 30(1). https://doi.org/10.1063/1.5133386
