Improved Bit Allocation Using Distortion for the CTU-Level Rate Control in HEVC

Abstract

Rate control has long been an active research topic in video coding. Although H.265/HEVC is the latest video coding standard, a mismatch between the number of target bits and the number of bits actually produced during encoding still exists. Extensive experimental results have shown that the R-D model matches the allocated bits more closely than the R-λ model. In this paper, a novel bit allocation method is proposed. First, a recursive Taylor-expansion solving method is used to solve the constrained rate-distortion optimization equation; the R-D model is then used to optimize the CTU-level target bit allocation. Simulation results indicate that the average bitrate error is only 0.19%, while the mean peak signal-to-noise ratio (MPSNR) increases by 0.14 dB.
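
The paper itself gives no code, but the following Python sketch illustrates the general idea of R-D-model-based CTU-level bit allocation under the commonly used hyperbolic model D_i = c_i * R_i^(-k_i). The function name allocate_ctu_bits, the per-CTU parameters (c_i, k_i), and the Newton-style update (a first-order Taylor-expansion iteration for the Lagrange multiplier) are illustrative assumptions, not the authors' implementation.

    # Minimal sketch (not the authors' method) of CTU-level bit allocation
    # driven by a hyperbolic R-D model D_i = c_i * R_i^(-k_i).
    from typing import List, Tuple

    def allocate_ctu_bits(rd_params: List[Tuple[float, float]],
                          frame_target_bits: float,
                          iterations: int = 20) -> List[float]:
        """Distribute frame_target_bits over CTUs by imposing the Lagrangian
        condition dD_i/dR_i = -lambda, which gives R_i = (c_i*k_i/lambda)^(1/(k_i+1)),
        and finding lambda so that sum(R_i) equals the frame budget via
        Newton iteration (a first-order Taylor-expansion update)."""

        def total_bits(lmbda: float) -> float:
            return sum((c * k / lmbda) ** (1.0 / (k + 1.0)) for c, k in rd_params)

        def total_bits_derivative(lmbda: float) -> float:
            # d/d(lambda) of sum R_i(lambda)
            return sum(-(1.0 / (k + 1.0)) * (c * k) ** (1.0 / (k + 1.0))
                       * lmbda ** (-(1.0 / (k + 1.0)) - 1.0) for c, k in rd_params)

        lmbda = 1.0  # initial guess for the Lagrange multiplier
        for _ in range(iterations):
            f = total_bits(lmbda) - frame_target_bits
            step = f / total_bits_derivative(lmbda)
            lmbda = max(lmbda - step, 1e-6)  # keep lambda positive

        return [(c * k / lmbda) ** (1.0 / (k + 1.0)) for c, k in rd_params]

    if __name__ == "__main__":
        # Three CTUs with arbitrary illustrative (c_i, k_i) parameters and a
        # frame-level budget of 12000 bits.
        params = [(5000.0, 1.2), (8000.0, 1.0), (3000.0, 1.4)]
        bits = allocate_ctu_bits(params, 12000.0)
        print([round(b, 1) for b in bits], "sum =", round(sum(bits), 1))

Solving for the multiplier so that the per-CTU allocations sum to the frame budget is what keeps the bitrate error small; the model parameters above are placeholders that an encoder would estimate and update during coding.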

CITATION STYLE

APA

Lu, X., Zhou, B., Jin, X., & Gu, Y. (2019). Improved Bit Allocation Using Distortion for the CTU-Level Rate Control in HEVC. In Lecture Notes in Electrical Engineering (Vol. 515, pp. 292–301). Springer Verlag. https://doi.org/10.1007/978-981-13-6264-4_35
