A Machine Learning Approach to Solve the Network Overload Problem Caused by IoT Devices Spatially Tracked Indoors

Citations: 2 · Mendeley readers: 8

Abstract

Currently, there are billions of connected devices, and the Internet of Things (IoT) has boosted these numbers. In private networks, even a few hundred connected devices can cause instability and data loss in communication. In this article, we propose a machine learning-based model to solve the network overload caused by continuous monitoring of the trajectories of several devices tracked indoors. The proposed model was evaluated on over a hundred thousand coordinate locations of objects tracked in three synthetic environments and one real environment. We show that the network overload problem can be solved by increasing the latency of data transmission and predicting the intermediate coordinates of the trajectories on the server side, using ensemble models such as Random Forest as well as Artificial Neural Networks, without significant data loss. We also show that at least thirty intermediate coordinates of the tracked objects' trajectories can be predicted with R² greater than 0.8.
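The abstract describes a scheme in which tracked devices transmit positions less frequently and the server reconstructs the intermediate coordinates with a regression model. The sketch below is an illustrative reconstruction of that idea, not the authors' actual pipeline: it uses a synthetic circular trajectory, scikit-learn's RandomForestRegressor, and a simulated send rate of one in every 31 samples (so roughly thirty intermediate coordinates per transmitted point, as in the paper's claim), then scores the server-side predictions with R².

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score

# Synthetic indoor trajectory: a smooth 2-D path sampled densely,
# with small positional noise (all values here are illustrative).
rng = np.random.default_rng(0)
t_dense = np.linspace(0.0, 1.0, 310)
xy_dense = np.column_stack([np.cos(2 * np.pi * t_dense),
                            np.sin(2 * np.pi * t_dense)])
xy_dense += rng.normal(0.0, 0.01, xy_dense.shape)

# Simulate the reduced send rate: the device transmits every 31st
# point, leaving ~30 intermediate coordinates per interval for the
# server to predict.
sent_idx = np.arange(0, len(t_dense), 31)
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(t_dense[sent_idx].reshape(-1, 1), xy_dense[sent_idx])

# Predict the withheld intermediate coordinates and score them.
held_idx = np.setdiff1d(np.arange(len(t_dense)), sent_idx)
pred = model.predict(t_dense[held_idx].reshape(-1, 1))
r2 = r2_score(xy_dense[held_idx], pred)
print(f"R^2 on intermediate coordinates: {r2:.3f}")
```

On a smooth trajectory like this, even a simple time-indexed Random Forest interpolates the held-out points well; the paper's setting presumably uses richer features (recent positions, device identity, environment layout) rather than raw time alone.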

Citation (APA)
Carvalho, D., Sullivan, D., Almeida, R., & Caminha, C. (2022). A Machine Learning Approach to Solve the Network Overload Problem Caused by IoT Devices Spatially Tracked Indoors. Journal of Sensor and Actuator Networks, 11(2). https://doi.org/10.3390/jsan11020029
