Design of A new Algorithm by Using Standard Deviation Techniques in Multi Edge Computing with IoT Application

Abstract

The Internet of Things (IoT) requires a new processing model that allows scalability in cloud computing while reducing the time delay caused by data transmission within a network. Such a model can be achieved by using resources closer to the user, i.e., by relying on edge computing (EC). The amount of IoT data also grows with the number of IoT devices. However, building such a flexible model within a heterogeneous environment is difficult in terms of resources. Moreover, the increasing demand for IoT services requires shortening time delay and response time through effective load balancing. IoT devices are expected to generate huge amounts of data within a short time. They will be deployed dynamically, and IoT services will be placed on EC devices or cloud servers to minimize resource costs while meeting the latency and quality of service (QoS) constraints of IoT applications when IoT devices sit at the network endpoint. EC is an emerging solution to the data processing problem in IoT. In this study, we improve the load balancing process and distribute resources fairly among tasks, which in turn improves QoS in the cloud and reduces processing time and, consequently, response time.
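The abstract does not detail the algorithm, but its title suggests using the standard deviation of node loads as a balance criterion. A minimal, hypothetical sketch (function and node names are illustrative, not from the paper): assign each incoming task to the edge node whose selection minimizes the resulting standard deviation of loads across nodes.

```python
import statistics

def assign_task(loads, task_cost):
    """Hypothetical sketch: pick the edge node index such that
    placing the task there minimizes the standard deviation of
    per-node loads (i.e., keeps the cluster most balanced)."""
    best_node, best_std = None, float("inf")
    for i in range(len(loads)):
        trial = loads.copy()
        trial[i] += task_cost          # simulate placing the task on node i
        std = statistics.pstdev(trial) # spread of loads after placement
        if std < best_std:
            best_node, best_std = i, std
    return best_node

# Example: three edge nodes with current loads; a new task of cost 3
# goes to the least-loaded node, since that flattens the distribution.
node = assign_task([10.0, 4.0, 7.0], 3.0)
```

This greedy per-task rule is one plausible reading of a standard-deviation-based balancer; the paper's actual algorithm may differ (e.g., batch reassignment or weighted costs).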

Citation (APA)

Almashhadani, H. A., Deng, X., Al-Hwaidi, O. R., Abdul-Samad, S. T., Ibrahm, M. M., & Latif, S. N. A. (2023). Design of A new Algorithm by Using Standard Deviation Techniques in Multi Edge Computing with IoT Application. KSII Transactions on Internet and Information Systems, 17(4), 1147–1161. https://doi.org/10.3837/tiis.2023.04.006
