Modeling of rainfall time series and extremes using bounded random cascades and Levy-stable distributions

Abstract

A new model for simulating rainfall time series is proposed. It is shown that both the intensity and duration of individual rainfall events are best modeled by a 'fat-tailed' Levy-stable distribution. The temporal downscaling of individual events is produced by a new type of bounded random cascade model. The proposed rainfall model successfully reproduces the statistical behavior of individual storms and, in particular, of the annual maxima. In contrast, a model based on a gamma distribution for rainfall intensity substantially underestimates the magnitudes of extreme events and does not correctly reproduce their scaling behavior. Similarly, a model based on a self-similar random cascade (as opposed to the bounded cascade) substantially overestimates the extreme events.
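The two ingredients described in the abstract can be illustrated with a minimal sketch: a positively skewed Levy-stable draw for total event depth, and a multiplicative cascade whose weight variability shrinks geometrically with cascade level (the "bounded" property, in contrast to a self-similar cascade whose weights are level-independent). All parameter values below (`alpha`, `beta`, the spread `a0`, and the decay exponent `H`) are illustrative assumptions, not the fitted values from the paper.

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)

def sample_event_depth(alpha=1.5, beta=1.0, loc=5.0, scale=2.0):
    """Draw a 'fat-tailed' Levy-stable variate for total event depth.

    Illustrative parameters; beta=1 gives positive skew. The stable
    distribution has support on the whole real line, so we truncate
    at a small positive value to keep depth physical.
    """
    x = levy_stable.rvs(alpha, beta, loc=loc, scale=scale, random_state=rng)
    return max(x, 0.1)

def bounded_cascade(total, levels=8, a0=0.4, H=0.5):
    """Distribute `total` over 2**levels intervals by a bounded cascade.

    At level n, each interval splits its mass with a random weight
    w ~ Uniform(0.5 - a_n, 0.5 + a_n), where a_n = a0 * 2**(-H*n).
    The shrinking spread a_n makes the generator bounded: the field
    becomes progressively smoother at small scales. Setting H=0
    would recover a self-similar cascade. Mass is conserved exactly.
    """
    field = np.array([total])
    for n in range(levels):
        a = a0 * 2.0 ** (-H * n)                      # level-dependent spread
        w = rng.uniform(0.5 - a, 0.5 + a, size=field.size)
        left, right = field * w, field * (1.0 - w)    # split each interval
        field = np.empty(2 * field.size)
        field[0::2], field[1::2] = left, right
    return field

depth = sample_event_depth()
series = bounded_cascade(depth)   # within-event rainfall at 2**8 time steps
```

Repeating this per event (with Levy-stable durations as well) and scanning the resulting yearly series for maxima would mimic the annual-maxima analysis the abstract describes.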

Citation (APA):

Menabde, M., & Sivapalan, M. (2000). Modeling of rainfall time series and extremes using bounded random cascades and Levy-stable distributions. Water Resources Research, 36(11), 3293–3300. https://doi.org/10.1029/2000WR900197
