Abstract
Recent years have witnessed the great success of mobile short-form video apps. However, most current video streaming strategies are designed for long-form videos and cannot be directly applied to short-form videos. In particular, short-form videos differ in many aspects, such as shorter video length, mobile friendliness, and sharp popularity dynamics. Facing these challenges, in this paper we perform an in-depth measurement study on Douyin, one of the most popular mobile short-form video platforms in China. The measurement study reveals that Douyin adopts a rather simple strategy (called the Next-One strategy) based on HTTP progressive download, which uses a sliding window with a stop-and-wait protocol. Such a strategy performs poorly when the network connection is slow and user scrolling is fast. These results motivate us to design an intelligent adaptive streaming scheme for mobile short-form videos. We formulate the short-form video streaming problem and propose an adaptive short-form video streaming strategy called LiveClip, based on a deep reinforcement learning (DRL) approach. Trace-driven experimental results show that LiveClip outperforms existing state-of-the-art approaches by around 10%-40% under various scenarios.
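The Next-One behavior described in the abstract (a two-clip sliding window with stop-and-wait downloading) can be illustrated with a toy scheduler. This is our own simplified sketch for intuition, not code from the paper; the function name and simulation model are assumptions.

```python
def next_one_downloads(num_videos, scrolls):
    """Toy model of a Next-One prefetch schedule (our simplification).

    The player keeps a two-clip sliding window: it downloads the clip
    currently on screen, then prefetches exactly the next clip, and
    waits (stop-and-wait) until the user scrolls before fetching more.
    `scrolls` is the number of swipe-to-next events; the return value
    is the order in which clip indices are fetched.
    """
    order = []
    fetched = set()
    for pos in range(scrolls + 1):        # initial view plus each scroll
        for idx in (pos, pos + 1):        # window = current clip + next clip
            if idx < num_videos and idx not in fetched:
                order.append(idx)
                fetched.add(idx)
    return order

# With 5 clips and 2 scrolls, the player only ever gets one clip ahead:
# [0, 1, 2, 3] -- clip 4 is never prefetched, so a fast scroller on a
# slow link stalls, which is the weakness the abstract points out.
print(next_one_downloads(5, 2))
```

The point of the sketch is that the download frontier never advances more than one clip past the viewing position, so rapid scrolling outruns the prefetcher whenever per-clip download time exceeds per-clip viewing time.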
Citation
He, J., Hu, M., Zhou, Y., & Wu, D. (2020). LiveClip: Towards intelligent mobile short-form video streaming with deep reinforcement learning. In NOSSDAV 2020 - Proceedings of the 2020 Workshop on Network and Operating System Support for Digital Audio and Video, Part of MMSys 2020 (pp. 54–59). Association for Computing Machinery. https://doi.org/10.1145/3386290.3396937