Hot temperature extremes have increased substantially in frequency and magnitude over the past decades. A widely used approach to quantify this phenomenon is to standardize temperature data relative to the local mean and variability of a reference period. Here we demonstrate that this conventional procedure leads to exaggerated estimates of increasing temperature variability and extremes. For example, with a 30-year reference period and time-invariant simulated Gaussian data, the occurrence of "two-sigma extremes" would be overestimated by 48.2%, corresponding to an increase from a 2.0% to a 2.9% probability of such events. We derive an analytical correction revealing that these artifacts prevail in recent studies. Our analyses lead to a revision of earlier reports: for instance, we show that there is no evidence for a recent increase in normalized temperature variability. In conclusion, we provide an analytical pathway to describe changes in variability and extremes in climate observations and model simulations.

Key Points
- Conventional normalization of spatiotemporal data sets with respect to a reference period induces artifacts
- Normalization-induced artifacts are most severe if variability or extremes are under scrutiny
- The study provides an analytical correction and an accurate estimate of variability and extremes
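The normalization artifact described in the abstract can be illustrated with a small Monte Carlo experiment: simulate stationary Gaussian data, standardize every year by the mean and standard deviation estimated from a 30-year reference period, and compare how often the two-sigma threshold is exceeded inside versus outside that period. The sketch below is not the authors' code; the random seed, the number of simulated series, the post-reference length, and the use of a one-sided threshold with the ddof=1 standard deviation are illustrative assumptions, so the resulting numbers only approximate the magnitudes quoted in the abstract.

```python
import math
import numpy as np

rng = np.random.default_rng(42)

n_ref = 30        # length of the reference period (years)
n_out = 30        # years outside the reference period (illustrative choice)
n_sim = 100_000   # number of simulated independent series (illustrative choice)
threshold = 2.0   # "two-sigma" threshold on the normalized scale

# Time-invariant Gaussian "temperature" data: no trend, constant variance.
x = rng.standard_normal((n_sim, n_ref + n_out))

# Conventional normalization: standardize every year by the mean and standard
# deviation estimated from the reference period only.
ref = x[:, :n_ref]
mu_hat = ref.mean(axis=1, keepdims=True)
sigma_hat = ref.std(axis=1, ddof=1, keepdims=True)
z = (x - mu_hat) / sigma_hat

# One-sided exceedance frequency of the two-sigma threshold, inside versus
# outside the reference period. The data are stationary, so both frequencies
# should match the nominal Gaussian probability; the normalization instead
# deflates the in-reference frequency and inflates the out-of-reference one,
# producing an apparent (spurious) increase in extremes.
p_in = float((z[:, :n_ref] > threshold).mean())
p_out = float((z[:, n_ref:] > threshold).mean())
p_nominal = 0.5 * math.erfc(threshold / math.sqrt(2.0))

print(f"nominal P(Z > 2)          : {p_nominal:.4f}")
print(f"inside reference period   : {p_in:.4f}")
print(f"outside reference period  : {p_out:.4f}")
print(f"apparent increase (out/in): {p_out / p_in:.2f}x")
```

For intuition on why an analytical treatment is possible: under these stationarity assumptions, a value from outside the reference period normalized by the estimated mean and standard deviation of n reference years, multiplied by sqrt(n/(n+1)), follows a Student t distribution with n-1 degrees of freedom rather than a standard normal. That is a standard sampling-theory result and only indicates the kind of relationship an analytical correction can build on; the paper's exact correction cannot be inferred from the abstract alone.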
CITATION STYLE
Sippel, S., Zscheischler, J., Heimann, M., Otto, F. E. L., Peters, J., & Mahecha, M. D. (2015). Quantifying changes in climate variability and extremes: Pitfalls and their overcoming. Geophysical Research Letters, 42(22), 9990–9998. https://doi.org/10.1002/2015GL066307