A Residual Dense U-Net Neural Network for Image Denoising


Abstract

In recent years, convolutional neural networks have achieved considerable success in different computer vision tasks, including image denoising. In this work, we present a residual dense U-Net neural network (RDUNet) for image denoising based on the densely connected hierarchical network. The encoding and decoding layers of RDUNet consist of densely connected convolutional layers, which reuse feature maps, together with local residual learning, which mitigates the vanishing gradient problem and speeds up training. Moreover, global residual learning is adopted so that, instead of directly predicting the denoised image, the model predicts the residual noise of the corrupted image. The network was trained for additive white Gaussian noise over a wide range of noise levels; hence, one advantage of the proposed method is that the denoising process does not require prior knowledge of the noise level. To evaluate the model, we conducted several experiments on publicly available natural image databases, achieving competitive results compared with state-of-the-art denoising networks. For comparison purposes, we used additive Gaussian noise with levels (standard deviations) 10, 30, and 50. For grayscale images we achieved PSNR values of 34.39, 29.11, and 26.99 dB and SSIM values of 0.9297, 0.8193, and 0.7491; for color images we obtained PSNR values of 36.68, 31.43, and 29.12 dB and SSIM values of 0.9600, 0.8961, and 0.8465.
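The three ideas summarized in the abstract (dense connections with feature reuse, local residual learning inside each block, and global residual learning that predicts the noise rather than the clean image) can be sketched in a few lines of PyTorch. This is only a minimal illustration under assumed settings: the class names, layer counts, growth rate, and the additive skip connection are placeholders and do not reproduce the actual RDUNet configuration from the paper.

import torch
import torch.nn as nn

class DenseResidualBlock(nn.Module):
    # Densely connected convolutional block with a local residual connection.
    # Each 3x3 convolution receives the concatenation of the block input and all
    # previous feature maps (feature reuse); a 1x1 convolution fuses them back
    # to `channels`, and the block input is added (local residual learning).
    def __init__(self, channels: int, growth: int = 32, layers: int = 3):
        super().__init__()
        self.convs = nn.ModuleList()
        in_ch = channels
        for _ in range(layers):
            self.convs.append(nn.Sequential(
                nn.Conv2d(in_ch, growth, kernel_size=3, padding=1),
                nn.PReLU(),
            ))
            in_ch += growth
        self.fuse = nn.Conv2d(in_ch, channels, kernel_size=1)  # local feature fusion

    def forward(self, x):
        features = [x]
        for conv in self.convs:
            features.append(conv(torch.cat(features, dim=1)))
        return x + self.fuse(torch.cat(features, dim=1))  # local residual learning

class ResidualDenoiser(nn.Module):
    # Toy encoder-decoder denoiser with global residual learning:
    # the network estimates the noise and subtracts it from the input.
    def __init__(self, in_channels: int = 3, base: int = 64):
        super().__init__()
        self.head = nn.Conv2d(in_channels, base, kernel_size=3, padding=1)
        self.down = nn.Conv2d(base, base, kernel_size=3, stride=2, padding=1)  # encoder downsampling
        self.enc = DenseResidualBlock(base)
        self.up = nn.ConvTranspose2d(base, base, kernel_size=2, stride=2)      # decoder upsampling
        self.dec = DenseResidualBlock(base)
        self.tail = nn.Conv2d(base, in_channels, kernel_size=3, padding=1)

    def forward(self, noisy):
        f0 = self.head(noisy)
        f = self.enc(self.down(f0))
        f = self.dec(self.up(f) + f0)  # U-Net-style skip; addition used here for brevity
        noise = self.tail(f)
        return noisy - noise           # global residual learning: predict the noise, not the clean image

if __name__ == "__main__":
    model = ResidualDenoiser()
    clean = torch.rand(1, 3, 64, 64)
    noisy = clean + (30.0 / 255.0) * torch.randn_like(clean)  # AWGN, noise level 30
    denoised = model(noisy)

During training, such a model would see clean patches paired with versions corrupted by AWGN at randomly sampled noise levels, which is what allows a single network to denoise without prior knowledge of the noise level at test time.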

Citation (APA)

Gurrola-Ramos, J., Dalmau, O., & Alarcón, T. E. (2021). A Residual Dense U-Net Neural Network for Image Denoising. IEEE Access, 9, 31742–31754. https://doi.org/10.1109/ACCESS.2021.3061062
