A TensorFlow implementation of Local Binary Patterns Transform

  • AKGÜN D

Abstract

Feature-extraction layers such as the Local Binary Patterns (LBP) transform can substantially improve the accuracy of machine learning and deep learning models, depending on the problem type. Direct implementations of such layers in pure Python, however, may run slowly and can significantly delay the training of a computer vision model. The TensorFlow framework addresses this by allowing accelerated custom operations to be composed from existing operations that already support hardware such as multicore CPUs and GPUs. In this study, the LBP transform, which is used for feature extraction in various applications, was implemented using TensorFlow operations. For performance comparison, evaluations were carried out with both standard Python operations and the TensorFlow library, using images of various dimensions and various batch sizes. Numerical results show that the TensorFlow-based algorithm provides good acceleration over the pure-Python runs. The implementation can be used for accelerated LBP feature extraction in machine learning as well as deep learning applications.
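As an illustration of the transform the paper accelerates: the basic 8-neighbor LBP compares each pixel to its eight neighbors and packs the comparison results into one byte. The sketch below is a minimal NumPy version (the function name and the neighbor ordering are illustrative choices, not the paper's code); each step — padding/slicing, element-wise comparison, casting, and bit-weighting — maps directly onto TensorFlow operations such as tensor slicing, `tf.cast`, and broadcasting comparisons, which is the composition strategy the abstract describes.

```python
import numpy as np

def lbp_transform(img):
    """Basic 8-neighbor LBP on a 2D grayscale image (border pixels excluded).

    Vectorized with shifted array slices rather than per-pixel Python loops;
    the same structure can be expressed with TensorFlow slicing and
    element-wise ops to run on multicore CPU or GPU.
    """
    c = img[1:-1, 1:-1]  # center pixels
    # Neighbor offsets in clockwise order starting at the top-left
    # (this ordering is a common convention, assumed here for illustration).
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    out = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        # Shifted view aligned with the center block: neighbor at (dy, dx).
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        # Set this bit wherever the neighbor is >= the center pixel.
        out |= (nb >= c).astype(np.uint8) << bit
    return out
```

Because the eight comparisons are independent, the loop over offsets is only eight iterations regardless of image size; all per-pixel work happens inside vectorized (and, in TensorFlow, hardware-accelerated) array operations, which is where the reported speedup over direct Python implementations comes from.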

Cite

AKGÜN, D. (2021). A TensorFlow implementation of Local Binary Patterns Transform. MANAS Journal of Engineering, 9(1), 15–21. https://doi.org/10.51354/mjen.822630
