Gaussian Blur through Parallel Computing


Abstract

Two-dimensional (2D) convolution is one of the most computationally complex and memory-intensive algorithms used in image processing. In this paper, we present the 2D convolution algorithm used in Gaussian blur, a filter widely used for noise reduction that has high computational requirements. Because single-threaded solutions cannot keep up with the performance and speed needed for image processing techniques, parallelizing image convolution on parallel systems enhances performance and reduces processing time. This paper gives an overview of the performance enhancement that parallel systems provide for image convolution using the Gaussian blur algorithm. We compare the speedup of the algorithm on two parallel systems, a multi-core central processing unit (CPU) and a graphics processing unit (GPU), using Google Colaboratory ("Colab").
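The core operation the abstract describes, sliding a Gaussian kernel over every pixel of an image, can be sketched in plain NumPy. This is a minimal serial sketch for illustration, not the paper's implementation; the parallel CPU/GPU versions the paper benchmarks would distribute the per-pixel loop across threads or CUDA cores:

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    # Build a normalized 2D Gaussian kernel (weights sum to 1).
    ax = np.arange(size) - (size - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return k / k.sum()

def gaussian_blur(image, kernel):
    # Naive serial 2D convolution with zero padding ("same" output size).
    # Each output pixel is independent, which is what makes this loop
    # easy to parallelize across CPU threads or GPU cores.
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)), mode="constant")
    flipped = kernel[::-1, ::-1]  # flip kernel for true convolution
    out = np.empty(image.shape, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * flipped)
    return out

img = np.random.rand(32, 32)
blurred = gaussian_blur(img, gaussian_kernel(size=5, sigma=1.0))
```

Because every output pixel depends only on a small read-only neighborhood of the input, the two nested loops have no cross-iteration dependencies; this independence is what the paper exploits on the multi-core CPU and GPU.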

Citation (APA)

Ibrahim, N. M., ElFarag, A. A., & Kadry, R. (2021). Gaussian Blur through Parallel Computing. In Proceedings of the International Conference on Image Processing and Vision Engineering, IMPROVE 2021 (pp. 175–179). SciTePress. https://doi.org/10.5220/0010513301750179
