A GPU-accelerated real-time NLMeans algorithm for denoising color video sequences


Abstract

The NLMeans filter, originally proposed by Buades et al., is a very popular filter for the removal of white Gaussian noise, owing to its simplicity and excellent performance. The strength of this filter lies in exploiting the repetitive character of structures in images. However, to take full advantage of this repetitiveness, a computationally intensive search for similar candidate blocks is indispensable. In previous work, we presented a number of algorithmic acceleration techniques for the NLMeans filter for still grayscale images. In this paper, we go one step further and incorporate both temporal and color information into the NLMeans algorithm in order to restore video sequences. Starting from our algorithmic acceleration techniques, we investigate how the NLMeans algorithm can be easily mapped onto recent parallel computing architectures. In particular, we consider the graphics processing unit (GPU), which is available in most recent computers. Our developments lead to a high-quality denoising filter that can process DVD-resolution video sequences in real time on a mid-range GPU. © 2010 Springer-Verlag.
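To make the core idea concrete: below is a minimal sketch of a brute-force, single-frame, single-channel NLMeans CUDA kernel. This is an illustrative assumption on our part, not the authors' accelerated implementation; it omits the paper's temporal and color extensions and its algorithmic speedups, and the patch half-size B, search half-size S, and filtering parameter h2 are placeholder values. Each output pixel is independent, which is why one-thread-per-pixel maps naturally onto the GPU.

// nlmeans_sketch.cu -- compile with: nvcc nlmeans_sketch.cu
// Hypothetical brute-force NLMeans sketch; parameters are illustrative.
#include <cuda_runtime.h>

#define B 3   // patch half-size: patches are (2B+1) x (2B+1)
#define S 7   // search-window half-size: (2S+1) x (2S+1) candidates

__global__ void nlmeans(const float* in, float* out, int w, int hgt)
{
    const float h2 = 100.0f;  // h^2, noise-dependent filtering parameter
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    // Skip image borders for brevity (a real filter would pad or clamp).
    if (x < B + S || y < B + S || x >= w - B - S || y >= hgt - B - S)
        return;

    float wsum = 0.0f, acc = 0.0f;
    // Exhaustive scan over all candidate blocks in the search window.
    for (int dy = -S; dy <= S; dy++) {
        for (int dx = -S; dx <= S; dx++) {
            // Sum of squared differences between the two patches.
            float d2 = 0.0f;
            for (int py = -B; py <= B; py++)
                for (int px = -B; px <= B; px++) {
                    float diff = in[(y + py) * w + (x + px)]
                               - in[(y + dy + py) * w + (x + dx + px)];
                    d2 += diff * diff;
                }
            // Patch similarity mapped to an exponential weight.
            float wgt = __expf(-d2 / h2);
            wsum += wgt;
            acc  += wgt * in[(y + dy) * w + (x + dx)];
        }
    }
    out[y * w + x] = acc / wsum;  // weighted average of candidate centers
}

int main(void)
{
    const int w = 128, hgt = 128;  // placeholder frame size
    size_t bytes = (size_t)w * hgt * sizeof(float);
    float *d_in, *d_out;
    cudaMalloc(&d_in, bytes);
    cudaMalloc(&d_out, bytes);
    cudaMemset(d_in, 0, bytes);   // placeholder image: all zeros
    cudaMemset(d_out, 0, bytes);  // borders stay zero (kernel skips them)

    dim3 threads(16, 16);
    dim3 blocks((w + 15) / 16, (hgt + 15) / 16);
    nlmeans<<<blocks, threads>>>(d_in, d_out, w, hgt);
    cudaDeviceSynchronize();

    cudaFree(d_in);
    cudaFree(d_out);
    return 0;
}

Note the cost this sketch makes visible: (2S+1)^2 candidate positions times (2B+1)^2 pixel comparisons per output pixel, which is exactly the search the abstract calls computationally intensive and the motivation for the paper's acceleration techniques.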

Citation (APA)

Goossens, B., Luong, H., Aelterman, J., Pižurica, A., & Philips, W. (2010). A GPU-accelerated real-time NLMeans algorithm for denoising color video sequences. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6475 LNCS, pp. 46–57). https://doi.org/10.1007/978-3-642-17691-3_5
