Detection of critical slowing down (CSD) is the dominant avenue for anticipating critical transitions from noisy time-series data. Most commonly, changes in variance and lag-1 autocorrelation [AC(1)] are used as CSD indicators. However, these indicators will only produce reliable results if the noise driving the system is white and stationary. In the more realistic case of time-correlated red noise, increasing (decreasing) the correlation of the noise will lead to spurious (masked) alarms for both variance and AC(1). Here, we propose two new methods that can discriminate true CSD from possible changes in the driving noise characteristics. We focus on estimating changes in the linear restoring rate based on Langevin-type dynamics driven by either white or red noise. We assess the capacity of our new estimators to anticipate critical transitions and show that they perform significantly better than other existing methods both for continuous-time and discrete-time models. In addition to conceptual models, we apply our methods to climate model simulations of the termination of the African Humid Period. The estimations rule out spurious signals stemming from nonstationary noise characteristics and reveal a destabilization of the African climate system as the dynamical mechanism underlying this archetype of abrupt climate change in the past.
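The abstract names variance and lag-1 autocorrelation [AC(1)] as the conventional CSD indicators and frames the dynamics as a Langevin-type process driven by white or red noise. Below is a minimal illustrative sketch, in Python, of those conventional indicators computed in sliding windows on a synthetic Langevin process driven by red (Ornstein-Uhlenbeck) noise. It is not the authors' new estimators, which the abstract does not specify; all function names, parameter values, and the noise-correlation mapping are assumptions made for illustration.

```python
# Hypothetical sketch: conventional CSD indicators (variance and AC(1)) in
# sliding windows, applied to a linear Langevin process driven by red noise.
import numpy as np

def simulate_langevin_red_noise(n=20000, dt=0.01, lam=1.0,
                                noise_corr=0.5, sigma=1.0, seed=0):
    """Euler-Maruyama integration of dx/dt = -lam * x + eta, where eta is an
    Ornstein-Uhlenbeck (red) noise process. Parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    eta = 0.0
    # Map the desired lag-dt correlation of the driving noise to an OU decay rate.
    theta = -np.log(noise_corr) / dt
    for i in range(1, n):
        eta += -theta * eta * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        x[i] = x[i - 1] - lam * x[i - 1] * dt + eta * dt
    return x

def rolling_csd_indicators(x, window=2000):
    """Variance and lag-1 autocorrelation in overlapping sliding windows."""
    var, ac1 = [], []
    for start in range(0, len(x) - window, window // 4):
        w = x[start:start + window] - x[start:start + window].mean()
        var.append(w.var())
        ac1.append(np.corrcoef(w[:-1], w[1:])[0, 1])
    return np.array(var), np.array(ac1)

x = simulate_langevin_red_noise()
var, ac1 = rolling_csd_indicators(x)
print(var[:3], ac1[:3])
```

With the restoring rate lam held fixed, trends in these windowed indicators can only reflect changes in the driving noise, which is the kind of spurious or masked signal the paper's estimators are designed to rule out.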
Morr, A., & Boers, N. (2024). Detection of Approaching Critical Transitions in Natural Systems Driven by Red Noise. Physical Review X, 14(2), 021037. https://doi.org/10.1103/PhysRevX.14.021037