Traditional designs of active queue management (AQM) assume a fixed model at a single operating point and lack planning. AQM schemes such as controlled delay (CoDel) provide some planning but are insensitive to dynamic traffic. In this Letter, the authors propose a dynamic-traffic-aware AQM based on deep reinforcement learning that extends the model to a grid of operating points and adds planning. By learning the network parameters on the grid and the transitions between operating points from collected data, a fluid model of transmission control protocol (TCP) dynamics is obtained. Planning across operating points is then strengthened by value iteration. Evaluations show that the added planning reduces queuing delay under dynamic traffic.
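The planning step in the abstract rests on value iteration over a discretised grid of operating points. The sketch below is only a minimal illustration of that idea under stated assumptions: the grid size, the transition probabilities P, and the delay-based reward are hypothetical placeholders, not the fluid model or parameters learned from collected data in the Letter.

import numpy as np

# Illustrative value-iteration sketch over a small grid of operating points.
# Grid size, transitions, and the delay-based reward are assumed placeholders,
# not the model learned from collected data in the Letter.
n_states, n_actions, gamma = 16, 4, 0.9
rng = np.random.default_rng(0)

P = rng.dirichlet(np.ones(n_states), size=(n_states, n_actions))  # P[s, a, s']: transition probabilities
delay = rng.uniform(1.0, 50.0, size=(n_states, n_actions))        # hypothetical queuing delay per (state, action), in ms
R = -delay                                                        # reward penalises queuing delay

V = np.zeros(n_states)
for _ in range(500):
    Q = R + gamma * P @ V          # Q[s, a] = R[s, a] + gamma * sum over s' of P[s, a, s'] * V[s']
    V_new = Q.max(axis=1)          # Bellman optimality backup
    if np.max(np.abs(V_new - V)) < 1e-6:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=1)          # greedy AQM action at each operating point

In the Letter, the transition statistics and network parameters would come from data collected across the grid of operating points rather than from the random placeholders used above.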
CITATION STYLE
Jin, W., Gu, R., Ji, Y., Dong, T., Yin, J., & Liu, Z. (2019). Dynamic traffic aware active queue management using deep reinforcement learning. Electronics Letters, 55(20), 1084–1086. https://doi.org/10.1049/el.2019.1146