A novel definition of robustness for image processing algorithms


Abstract

As images have gained ever wider importance in our society, image processing has found numerous applications since the 1960s: biomedical imagery, security, and many more. A highly common issue in these processes is the presence of an uncontrolled and destructive perturbation generally referred to as "noise". The ability of an algorithm to resist this noise has been called "robustness", but this notion has never been clearly defined for image processing techniques. A wide bibliographic study shows that the term "robustness" is largely conflated with others such as efficiency and quality, leading to a disturbing confusion. In this article, we propose a completely new framework to define the robustness of image processing algorithms by considering multiple scales of additive noise. We show the relevance of our proposal by evaluating and comparing the robustness of recent and more classic algorithms designed for two tasks: still image denoising and background subtraction in videos.
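The central idea of the abstract — running an algorithm at multiple scales of additive noise and observing how its output degrades — can be sketched with a toy experiment. This is only an illustrative sketch, not the paper's actual protocol: the 1D signal, the mean-filter "denoiser", the MSE metric, and the names `robustness_profile`, `mean_filter`, and `add_noise` are all our assumptions.

```python
import math
import random

def add_noise(signal, sigma, rng):
    # Additive Gaussian noise; sigma is one "scale" of perturbation.
    return [x + rng.gauss(0.0, sigma) for x in signal]

def mean_filter(signal, radius=2):
    # Toy denoiser: sliding-window mean, standing in for the algorithm under test.
    out = []
    n = len(signal)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        window = signal[lo:hi]
        out.append(sum(window) / len(window))
    return out

def mse(a, b):
    # Mean squared error between the clean reference and the restored signal.
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def robustness_profile(clean, denoiser, sigmas, seed=0):
    # Evaluate the denoiser at several noise scales; the shape of the
    # resulting error curve is one way to characterise robustness to noise.
    rng = random.Random(seed)
    return [mse(clean, denoiser(add_noise(clean, s, rng))) for s in sigmas]

if __name__ == "__main__":
    clean = [math.sin(0.1 * t) for t in range(200)]
    sigmas = [0.05, 0.1, 0.2, 0.4]
    for sigma, err in zip(sigmas, robustness_profile(clean, mean_filter, sigmas)):
        print(f"sigma={sigma:.2f}  MSE={err:.4f}")
```

A flat error curve across scales would indicate a robust algorithm under this toy criterion; a steeply rising one, a fragile algorithm.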

Citation (APA)

Vacavant, A. (2017). A novel definition of robustness for image processing algorithms. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10214 LNCS, pp. 75–87). Springer Verlag. https://doi.org/10.1007/978-3-319-56414-2_6
