The evolution of drainage networks at large scales has been shown to follow a principle of minimization of the global rate of energy dissipation. This study was undertaken to evaluate whether a similar principle holds for rill networks at a much smaller scale. Simulated rainfall was applied to a 2 m by 4 m flume under varied initial slope (5% and 20%) and roughness (low, moderate, and high) conditions. The results indicated that, assuming the validity of a local optimality principle, the rill networks evolved according to a global principle of energy optimization where rilling was intense (20% slope), but not at the 5% slope, where diffusive processes dominated the overall erosion process. These results suggest that models similar to those used to explain the evolution of river networks may help to explain the initiation and evolution of rill networks under intensive rilling. Our results and analysis suggest that further experiments could examine the spatial distribution of flow velocity within the rill networks at a given flow discharge and the local rate of energy dissipation at individual rill links. Despite the convergence toward similar values of some network characteristics, differences in the microrelief of the initial surfaces translated into significant differences between the final rill networks.
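For readers unfamiliar with the optimality framework referenced above, the following is a minimal sketch (not taken from the paper) of how a global rate of energy dissipation is typically scored for a channel or rill network, assuming the standard optimal-channel-network functional E = k * sum_i Q_i^gamma * L_i with gamma near 0.5; the function name, exponent, and link data are illustrative assumptions.

    # Sketch, assuming the standard OCN-style energy dissipation functional.
    def total_energy_dissipation(links, gamma=0.5, k=1.0):
        """links: iterable of (Q_i, L_i) pairs, one per network link.

        Q_i is the discharge carried by link i and L_i its length; the network
        that minimizes this sum is the one favored by the global optimality
        principle discussed in the abstract.
        """
        return k * sum(q ** gamma * length for q, length in links)

    # Hypothetical comparison of two three-link configurations carrying the
    # same total discharge; the configuration with the lower value would be
    # preferred under the minimization principle.
    network_a = [(1.0, 2.0), (1.0, 2.0), (2.0, 1.0)]
    network_b = [(0.5, 3.0), (1.5, 2.0), (2.0, 1.5)]
    print(total_energy_dissipation(network_a), total_energy_dissipation(network_b))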
Citation:
Gómez, J. A., Darboux, F., & Nearing, M. A. (2003). Development and evolution of rill networks under simulated rainfall. Water Resources Research, 39(6). https://doi.org/10.1029/2002WR001437