Revisiting online scheduling for AI-driven internet of things

Abstract

AI-driven Internet of Things (IoT) systems use AI inference to characterize data gathered from various sensors. Together, AI and IoT support smart buildings, cities, cars, and drones. However, AI inference components require updates when executed in new contexts. Frequent updates consume more energy and drain precious IoT batteries. Updates can be batched together to save energy, but it is challenging to batch updates well without knowing when updates will arrive, what their processing needs will be, and how long they can be delayed. This work studies update batching and its potential energy savings. We define an update batching policy as a sequence of discrete choices about when to apply concurrent updates, which allows us to use random walks to sample update batching policies. Our random walks simulate nearly 1M batching policies and model their energy footprint for an AI-driven IoT system comprising 50 AI inference components. The best policy uses much less energy than policies at the 95th and 99th percentiles. First-come-first-served and shortest-job-first policies perform like the median sampled batching policy, using 7X more energy than the best policy.
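The batching-policy idea in the abstract can be made concrete with a small simulation. The sketch below is a hedged illustration, not the authors' simulator: the update arrival model, the WAKE_COST and UNIT_COST energy constants, and the helpers make_updates and policy_energy are assumptions made for this example. Each sampled policy is a random sequence of flush-or-defer decisions, i.e., a random walk over discrete batching choices, and its energy is compared against a first-come-first-served baseline that applies every update on arrival.

```python
"""Toy random-walk exploration of update batching policies.

Illustrative sketch only: arrival model, energy constants, and deadline
handling are assumptions, not the paper's simulator.
"""
import random

random.seed(0)

WAKE_COST = 5.0   # assumed fixed energy cost to wake a device and apply one batch
UNIT_COST = 1.0   # assumed energy per unit of update processing work


def make_updates(n=50):
    """Generate n updates as (arrival_time, work, latest_allowed_time)."""
    updates, t = [], 0.0
    for _ in range(n):
        t += random.expovariate(1.0)        # random inter-arrival times
        work = random.uniform(0.5, 2.0)     # processing need
        slack = random.uniform(1.0, 10.0)   # how long the update may be delayed
        updates.append((t, work, t + slack))
    return updates


def policy_energy(updates, choices):
    """Energy of one batching policy.

    choices[i] is the discrete decision taken when update i arrives:
    True  -> flush the pending batch now,
    False -> defer, unless a pending update would miss its deadline.
    """
    energy, pending = 0.0, []
    for (arrival, work, deadline), flush_now in zip(updates, choices):
        pending.append((work, deadline))
        must_flush = any(d <= arrival for _, d in pending)  # deadline pressure
        if flush_now or must_flush:
            energy += WAKE_COST + UNIT_COST * sum(w for w, _ in pending)
            pending = []
    if pending:  # apply whatever is still queued at the end
        energy += WAKE_COST + UNIT_COST * sum(w for w, _ in pending)
    return energy


updates = make_updates(50)

# Random walk over the policy space: each sample is a random flush/defer sequence.
samples = sorted(
    policy_energy(updates, [random.random() < 0.5 for _ in updates])
    for _ in range(10_000)
)

fcfs = policy_energy(updates, [True] * len(updates))  # apply each update on arrival

print("best sampled :", samples[0])
print("median       :", samples[len(samples) // 2])
print("95th pct     :", samples[int(0.95 * len(samples))])
print("FCFS-like    :", fcfs)
```

In this toy setup, batching several updates per wake-up amortizes the fixed wake cost, which is the effect the abstract quantifies at the scale of 50 inference components and nearly 1M sampled policies.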

Citation (APA)

Babu, N. T. R., & Stewart, C. (2019). Revisiting online scheduling for AI-driven internet of things. In Proceedings of the 4th ACM/IEEE Symposium on Edge Computing, SEC 2019 (pp. 310–312). Association for Computing Machinery, Inc. https://doi.org/10.1145/3318216.3363326
