Deep Neural Network Inference via Edge Computing: On-Demand Accelerating

Abstract

Deep Neural Networks (DNNs) are a key enabling technology for Artificial Intelligence applications in the coming 5G era and have attracted considerable attention. Complex DNN-based tasks, however, are difficult to run on resource-constrained mobile devices. This work introduces edge computing as a solution to these problems. The edge framework makes use of two design features: DNN partitioning, which splits inference between the mobile device and the edge server, and DNN right-sizing, which trims the portion of the network that must be executed. The training stage provides the information these features rely on. Edge preservation is also considered, a highly adaptive filtering approach for video images. Edge-preserving filters are essential tools for many image processing and transformation tasks: these nonlinear algorithms compute the filtered grey value from the contents of a given neighborhood, taking the average only over those neighborhood pixels whose grey values are similar. While preservation of edges is their common property, each edge-preserving filter is characterized by its own individual algorithm.
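As a rough illustration of the two design features named above, the following sketch selects a DNN partition point by comparing estimated on-device latency, transfer cost, and edge-server latency for each candidate split. The layer names, timings, sizes, and bandwidth are hypothetical assumptions, not values from the paper.

    # Hypothetical sketch: choose the layer after which intermediate features are
    # sent to the edge server so that estimated end-to-end latency is minimised.
    # All layer names, timings and sizes below are illustrative assumptions.

    # Per-layer profile: (name, device_ms, edge_ms, output_kb)
    LAYERS = [
        ("conv1", 12.0, 1.5, 800),
        ("conv2", 20.0, 2.5, 400),
        ("conv3", 18.0, 2.0, 200),
        ("fc",     6.0, 0.8,   4),
    ]

    BANDWIDTH_KB_PER_S = 1000.0  # assumed uplink bandwidth
    INPUT_KB = 600               # assumed size of the raw input frame

    def total_latency_ms(split):
        """Latency if layers [0, split) run on the device and the rest on the edge."""
        device = sum(layer[1] for layer in LAYERS[:split])
        edge = sum(layer[2] for layer in LAYERS[split:])
        payload_kb = LAYERS[split - 1][3] if split > 0 else INPUT_KB
        transfer = payload_kb / BANDWIDTH_KB_PER_S * 1000.0
        return device + transfer + edge

    best = min(range(len(LAYERS) + 1), key=total_latency_ms)
    print(f"best split after {best} layer(s): {total_latency_ms(best):.1f} ms")

The nonlinear, neighborhood-based averaging described at the end of the abstract can likewise be sketched as a thresholded neighborhood mean (a sigma-filter-style operation; the radius and grey-value tolerance below are illustrative assumptions, not values from the paper).

    import numpy as np

    def edge_preserving_mean(img, radius=1, grey_tol=10):
        """Replace each pixel by the mean of those neighborhood pixels whose grey
        value differs from the centre pixel by at most grey_tol. Pixels across a
        strong intensity step are excluded from the average, so edges survive."""
        h, w = img.shape
        out = np.empty_like(img, dtype=np.float64)
        padded = np.pad(img.astype(np.float64), radius, mode="edge")
        for y in range(h):
            for x in range(w):
                window = padded[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
                similar = window[np.abs(window - img[y, x]) <= grey_tol]
                out[y, x] = similar.mean()
        return out.astype(img.dtype)

    # A sharp step edge stays sharp while each flat region is smoothed.
    step = np.array([[10, 10, 10, 200, 200, 200]] * 5, dtype=np.uint8)
    print(edge_preserving_mean(step, radius=1, grey_tol=20))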

Citation (APA)

Singh, M. K., Karthik, M., Ramesh, P., & Rama Naidu, G. (2023). Deep Neural Network Inference via Edge Computing: On-Demand Accelerating. In Advances in Transdisciplinary Engineering (Vol. 32, pp. 557–562). IOS Press BV. https://doi.org/10.3233/ATDE221312
