Abstract
From AI-assisted art creation to large language model (LLM)-powered ChatGPT, AI-generated content (AIGC) and its services are becoming a transformative force. The telecom industry is called upon to embrace the prospects of AIGC services and to address the unique challenges of incorporating generative-model services into the AI-native 6G wireless network paradigm. We propose enabling AIGC inference services on mobile devices by optimizing MEC-device computation offloading, in which a reinforcement learning based policy agent minimizes AIGC task latency in a computing-resource-constrained and bandwidth-limited wireless environment. Simulation results demonstrate the performance advantage of the proposed approach.
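The core trade-off the abstract describes is between running an AIGC inference task on the device and offloading it to an MEC server, where offloading adds transmission delay but gains compute speed. The sketch below is illustrative only (not the paper's method or its reinforcement-learning agent): a simple latency comparison for a single task, with all parameter names and values hypothetical.

```python
# Illustrative sketch of the MEC-device offloading trade-off, not the
# paper's RL-based policy. All names and values are hypothetical.

def offload_decision(task_cycles, data_bits, f_device, f_mec, bandwidth_bps):
    """Choose where to run one AIGC task by comparing estimated latencies.

    task_cycles   : CPU cycles the task requires
    data_bits     : input size to upload if offloaded
    f_device      : device compute speed (cycles/s)
    f_mec         : MEC server compute speed (cycles/s)
    bandwidth_bps : uplink rate (bits/s)
    Returns (decision, latency_seconds).
    """
    # Local execution: compute on the device only.
    local_latency = task_cycles / f_device
    # Offloading: upload the input, then compute on the faster MEC server
    # (result download delay is ignored in this simplified model).
    mec_latency = data_bits / bandwidth_bps + task_cycles / f_mec
    if mec_latency < local_latency:
        return "mec", mec_latency
    return "local", local_latency
```

For example, a task of 1e9 cycles with an 8e6-bit input, a 1 GHz device, a 10 GHz MEC server, and a 10 Mbit/s uplink yields 1.0 s locally versus 0.8 + 0.1 = 0.9 s offloaded, so offloading wins; a slower uplink would flip the decision. A learned policy, as in the paper, would replace this static comparison with decisions adapted to the stochastic wireless environment.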
Citation
Zhou, C., Liu, W., Han, T., & Ansari, N. (2024). Deploying On-Device AIGC Inference Services in 6G via Optimal MEC-Device Offloading. IEEE Networking Letters, 6(4), 232–236. https://doi.org/10.1109/LNET.2024.3490954