Backscatter-Aided Hybrid Data Offloading for Mobile Edge Computing via Deep Reinforcement Learning

Abstract

Data offloading in mobile edge computing (MEC) allows low-power IoT devices at the network edge to optionally offload power-consuming computation tasks to MEC servers. In this paper, we consider a novel backscatter-aided hybrid data offloading scheme that further reduces the power consumed in data transmission. In particular, each device has a dual-mode radio that can offload data either via conventional active RF communications or via passive backscatter communications with extremely low power consumption. The flexibility in radio-mode switching makes it more complicated to design the optimal offloading strategy, especially in a dynamic network with time-varying workload and energy supply at each device. Hence, we propose a deep reinforcement learning (DRL) framework to handle the huge state space under uncertain network state information. Using a simple quantization scheme, we design the learning policy in the Double Deep Q-Network (DDQN) framework, which is shown to have better stability and convergence properties. Numerical results demonstrate that the proposed DRL approach learns and converges to the maximum energy efficiency, outperforming the baseline approaches.
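The paper itself does not provide code. As a rough illustration of the DDQN-based policy described in the abstract, the sketch below shows the double-Q target computation over a quantized action set of hybrid offloading decisions. Everything concrete here is an assumption for illustration, not taken from the paper: PyTorch as the framework, a three-dimensional state of (workload, harvested energy, channel gain), and an action set of 2 radio modes times 10 quantized offloading ratios.

```python
# Minimal DDQN sketch for a hybrid offloading decision (illustrative only).
# Assumptions (not from the paper): state = [workload, energy level, channel
# gain]; each discrete action encodes a (radio mode, offloading ratio) pair.
import random
from collections import deque

import torch
import torch.nn as nn

STATE_DIM = 3          # assumed state features
NUM_ACTIONS = 2 * 10   # assumed: 2 radio modes x 10 quantized offloading ratios
GAMMA = 0.99

class QNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, NUM_ACTIONS),
        )

    def forward(self, s):
        return self.net(s)

online, target = QNet(), QNet()
target.load_state_dict(online.state_dict())
opt = torch.optim.Adam(online.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)  # stores (state, action, reward, next_state, done)

def act(state, epsilon):
    """Epsilon-greedy choice over the quantized (mode, ratio) actions."""
    if random.random() < epsilon:
        return random.randrange(NUM_ACTIONS)
    with torch.no_grad():
        return online(torch.as_tensor(state, dtype=torch.float32)).argmax().item()

def train_step(batch_size=64):
    if len(replay) < batch_size:
        return
    s, a, r, s2, done = map(
        torch.as_tensor, zip(*random.sample(replay, batch_size))
    )
    s, s2, r = s.float(), s2.float(), r.float()
    q = online(s).gather(1, a.long().unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        # Double DQN: the online net selects the next action, the target net
        # evaluates it, which reduces Q-value overestimation vs. vanilla DQN.
        next_a = online(s2).argmax(dim=1, keepdim=True)
        q_next = target(s2).gather(1, next_a).squeeze(1)
        y = r + GAMMA * (1 - done.float()) * q_next
    loss = nn.functional.smooth_l1_loss(q, y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In a full training loop the target network's weights would be copied from the online network every few hundred steps; that sync interval, like the reward design (e.g., energy efficiency of the chosen mode), is a common DDQN choice assumed here rather than a value reported in the paper.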

Cite

APA

Xie, Y., Xu, Z., Xu, J., Gong, S., & Wang, Y. (2019). Backscatter-Aided Hybrid Data Offloading for Mobile Edge Computing via Deep Reinforcement Learning. In Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering, LNICST (Vol. 294, pp. 525–537). Springer. https://doi.org/10.1007/978-3-030-32388-2_45
