Vehicular edge computing (VEC) is emerging as a new computing paradigm to improve the quality of vehicular services and enhance the capabilities of vehicles. It enables tasks to be performed with low latency by deploying computing and storage resources close to vehicles. However, traditional task offloading schemes focus only on one-shot offloading and pay little attention to dependencies among subtasks. Furthermore, the continuous action space that arises during task offloading also needs to be handled. In this paper, an efficient dependency-aware task offloading scheme for VEC with vehicle-edge-cloud collaborative computation is proposed, where subtasks can be processed locally or offloaded to an edge server or a cloud server for execution. Specifically, first, a directed acyclic graph (DAG) is used to model the dependencies among subtasks. Second, a task offloading algorithm based on Deep Deterministic Policy Gradient (DDPG) is proposed to obtain the optimal offloading strategy in a vehicle-edge-cloud environment; it efficiently handles the continuous control problem and converges quickly. Finally, extensive simulation experiments have been conducted, and the results show that the proposed scheme improves performance by about 13.62% on average over three baseline schemes.
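The two technical ingredients named in the abstract, the DAG task model and the DDPG-based offloading policy, can be illustrated with a minimal sketch. The subtask attributes (`cycles`, `data_in`), the network sizes, and the idea of mapping a continuous action to a local/edge/cloud decision are illustrative assumptions for this sketch, not the paper's exact formulation.

```python
from dataclasses import dataclass, field
from typing import List
import torch
import torch.nn as nn

@dataclass
class Subtask:
    # Illustrative subtask attributes; the paper's task model may differ.
    sid: int
    cycles: float                                    # required CPU cycles
    data_in: float                                   # input data size (bits)
    preds: List[int] = field(default_factory=list)   # predecessor subtask ids

def topological_order(subtasks: List[Subtask]) -> List[int]:
    """Return a schedulable order of subtasks that respects DAG dependencies."""
    remaining = {t.sid: set(t.preds) for t in subtasks}
    order, ready = [], [t.sid for t in subtasks if not t.preds]
    while ready:
        sid = ready.pop()
        order.append(sid)
        for t in subtasks:
            if sid in remaining[t.sid]:
                remaining[t.sid].discard(sid)
                if not remaining[t.sid] and t.sid not in order and t.sid not in ready:
                    ready.append(t.sid)
    return order

class Actor(nn.Module):
    """DDPG actor: maps the system state to a continuous action, which is then
    interpreted as an offloading decision (e.g., thresholded into local / edge /
    cloud) plus a resource-allocation share for the chosen subtask."""
    def __init__(self, state_dim: int, action_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, action_dim), nn.Tanh(),   # continuous actions in [-1, 1]
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)
```

A full DDPG agent would pair this actor with a critic that estimates Q(s, a), an experience replay buffer, exploration noise, and soft target-network updates; those components are omitted here for brevity.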
Liu, G., Dai, F., Huang, B., Qiang, Z., Wang, S., & Li, L. (2022). A collaborative computation and dependency-aware task offloading method for vehicular edge computing: a reinforcement learning approach. Journal of Cloud Computing, 11(1). https://doi.org/10.1186/s13677-022-00340-3