A Satellite Image Time Series (SITS) is a data set of satellite images of the same area acquired across several years at a high acquisition rate. Radiometric normalization is a fundamental preprocessing step for remote sensing applications that use SITS, because noise introduces radiometric distortion between images. Traditional radiometric normalization methods handle multitemporal imagery (usually two or three scenes from different dates) by normalizing a subject image against a single reference image. However, these methods are unsuitable for calibrating SITS because they cannot minimize the radiometric distortion between every pair of images in the series. Existing relative radiometric normalization methods for SITS rely on linear assumptions and therefore cannot effectively reduce the nonlinear radiometric distortion caused by continuously changing noise. To overcome this problem and obtain a more accurate SITS, we propose a nonlinear radiometric normalization model (NMAG) for SITS based on an Artificial Neural Network (ANN) and a Greedy Algorithm (GA). In this method, the GA determines the correction order of the SITS and calculates the error between the image to be corrected and the already-normalized images, which avoids having to select a single reference image. The ANN then finds the optimal solution of the error function, minimizing the radiometric distortion between the different images in the SITS. A SITS composed of 21 Landsat-8 images of Tianjin, China, acquired from October 2017 to January 2019, was selected to test the method. We compared NMAG with two contrast methods (Contrast Method 1, CM1, and Contrast Method 2, CM2) and found that the average root mean square error (µRMSE) of NMAG (497.22) is significantly smaller than those of CM1 (641.39) and CM2 (543.47); the accuracy of the normalized SITS obtained with NMAG increases by 22.4% and 8.5% relative to CM1 and CM2, respectively. These experimental results confirm the effectiveness of NMAG in reducing the radiometric distortion caused by continuously changing noise between images in a SITS.
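To make the greedy-plus-ANN idea concrete, the following is a minimal sketch of the workflow the abstract describes, not the authors' implementation. It assumes scikit-learn's MLPRegressor as the ANN, the mean of the already-normalized images as the greedy correction target, and a per-image RMSE as the error function; the function names (normalize_sits, rmse) and the choice of greedy seed are illustrative assumptions.

```python
# Hypothetical sketch of the NMAG idea described in the abstract:
# a greedy ordering over the time series plus a small ANN fitted per image.
# Names and design details here are assumptions for illustration only.
import numpy as np
from sklearn.neural_network import MLPRegressor

def rmse(a, b):
    """Root mean square error between two co-registered images."""
    diff = a.astype(np.float64) - b.astype(np.float64)
    return float(np.sqrt(np.mean(diff ** 2)))

def normalize_sits(images, train_pixels=5000):
    """Greedily normalize a list of co-registered 2-D image arrays."""
    n = len(images)
    # Assumed greedy seed: start from the image with the smallest total
    # RMSE to all others, so no single reference image is hand-picked.
    totals = [sum(rmse(images[i], images[j]) for j in range(n) if j != i)
              for i in range(n)]
    seed = int(np.argmin(totals))
    normalized = {seed: images[seed].astype(np.float64)}
    remaining = set(range(n)) - {seed}
    rng = np.random.default_rng(0)
    while remaining:
        # The mean of the already-normalized images acts as a moving target.
        target = np.mean(list(normalized.values()), axis=0)
        # Greedy step: correct next the image with the smallest error
        # against the normalized set.
        nxt = min(remaining, key=lambda i: rmse(images[i], target))
        remaining.remove(nxt)
        x = images[nxt].reshape(-1, 1).astype(np.float64)
        y = target.reshape(-1)
        # Train on a random pixel subset for speed; a small ANN learns a
        # nonlinear mapping from raw to normalized radiometry.
        idx = rng.choice(x.shape[0], size=min(train_pixels, x.shape[0]),
                         replace=False)
        net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=500,
                           random_state=0)
        net.fit(x[idx], y[idx])
        normalized[nxt] = net.predict(x).reshape(images[nxt].shape)
    return [normalized[i] for i in range(n)]
```

Under this sketch, a statistic like the reported µRMSE would correspond to the mean pairwise RMSE over all image pairs in the normalized series.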
Yin, Z., Zou, L., Sun, J., Zhang, H., Zhang, W., & Shen, X. (2021). A nonlinear radiometric normalization model for satellite images time series based on artificial neural networks and greedy algorithm. Remote Sensing, 13(5), 1–15. https://doi.org/10.3390/rs13050933