Recently, the issue of noise reduction in chaotic hydrologic time series has started to attract attention. In this paper, the concept of noise reduction and the utility of its application to hydrologic time series are revisited based on a nonlinear noise reduction algorithm that differs from the algorithms discussed earlier in the hydrologic literature. First, the existence of chaotic behaviour in the time series is investigated. Second, the concepts of noise, its effects, and noise reduction are briefly discussed. Third, two nonlinear noise reduction methods are explained and applied to the daily data of the English River in Ontario to study whether noise reduction improves the accuracy of modelling the hydrologic time series. The estimation of missing data is selected as a common hydrologic problem. It is found that the nonlinear noise reduction algorithms either remove a significant part of the original signal or have an insignificant effect on the accuracy of modelling the time series. It is therefore recommended that the raw data always be the basis for time series analysis. © 2001 Taylor & Francis Group, LLC.
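The abstract does not specify which nonlinear noise reduction algorithms were applied. As a minimal sketch of the general idea behind this family of methods (simple nonlinear noise reduction by local averaging in a delay-embedding space, in the style of Schreiber), the following Python example is illustrative only: the function name, the embedding dimension `m`, and the neighbourhood radius `eps` are assumptions, not the paper's actual settings, and a synthetic noisy sine wave stands in for the river data.

```python
import numpy as np

def simple_nonlinear_noise_reduction(x, m=5, eps=0.3):
    """Illustrative simple nonlinear noise reduction (parameters are assumptions).

    Embed the series in m dimensions, then replace the middle coordinate of
    each delay vector with the mean of the middle coordinates of all
    neighbouring vectors within radius eps (max-norm).
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - m + 1
    # Delay-embedding matrix: row i = (x[i], x[i+1], ..., x[i+m-1]).
    emb = np.column_stack([x[j:n + j] for j in range(m)])
    mid = m // 2
    y = x.copy()
    for i in range(n):
        # Neighbours of emb[i] within eps in the max-norm
        # (always includes emb[i] itself, so the mean is well defined).
        d = np.max(np.abs(emb - emb[i]), axis=1)
        nbrs = d < eps
        y[i + mid] = emb[nbrs, mid].mean()
    return y

# Demonstration on a synthetic signal (stand-in for the daily river data).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 20.0 * np.pi, 2000)
clean = np.sin(t)
noisy = clean + 0.1 * rng.standard_normal(t.size)
denoised = simple_nonlinear_noise_reduction(noisy)

rmse_noisy = np.sqrt(np.mean((noisy - clean) ** 2))
rmse_denoised = np.sqrt(np.mean((denoised - clean) ** 2))
```

The paper's central caution applies directly to such a sketch: local averaging of this kind can also smooth away genuine fine-scale structure, which is why the abstract recommends keeping the raw data as the basis for analysis.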
CITATION STYLE
Elshorbagy, A. (2001). Noise reduction approach in chaotic hydrologic time series revisited. Canadian Water Resources Journal, 26(4), 537–550. https://doi.org/10.4296/cwrj2604537