Research of Data Compression Using Huffman Coding and Arithmetic Coding

Abstract

When transmitting and storing images or other data, we want the information to occupy less space without losing any of the original content. Entropy coding is a family of lossless data compression methods that encode symbols according to their frequency of occurrence. Common entropy codes include Shannon coding, Huffman coding, and arithmetic coding. This paper presents a comparative analysis of two common methods: Huffman coding and arithmetic coding. By exploiting the correlation between symbols in the coded sequence, the probability of a symbol sequence is used in place of the smaller probabilities of individual symbols. Applying this principle to binary arithmetic coding yields a method more effective than traditional Huffman coding: it shortens the average code length and brings the information content of the code closer to the entropy rate of the source, thereby significantly improving the compression ratio of binary arithmetic coding. Experimental tests on different types of data show good compression performance, and the results demonstrate the efficiency of the optimization method described in this paper.
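The abstract's central comparison is between the average code length achieved by Huffman coding and the source entropy, which is the lower bound that arithmetic coding can approach more closely. The following sketch (not the paper's implementation; the sample string and the helper name `huffman_code_lengths` are illustrative) builds a Huffman code for a small symbol distribution and compares its average code length with the Shannon entropy of the source:

```python
# Sketch: Huffman code construction and comparison of its average
# code length against the source entropy (the lower bound that
# arithmetic coding can approach more closely).
import heapq
import math
from collections import Counter

def huffman_code_lengths(freqs):
    """Return {symbol: code length in bits} for a Huffman code over freqs."""
    # Each heap entry: (weight, tiebreak, {symbol: depth_so_far}).
    # The unique tiebreak prevents Python from comparing the dicts.
    heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    if len(heap) == 1:  # degenerate single-symbol source
        return {next(iter(freqs)): 1}
    while len(heap) > 1:
        w1, _, d1 = heapq.heappop(heap)
        w2, _, d2 = heapq.heappop(heap)
        # Merging two subtrees adds one level to every symbol inside them.
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

text = "abracadabra"
freqs = Counter(text)
total = sum(freqs.values())
lengths = huffman_code_lengths(freqs)

# Average Huffman code length vs. Shannon entropy, in bits per symbol.
avg_len = sum(freqs[s] * lengths[s] for s in freqs) / total
entropy = -sum((f / total) * math.log2(f / total) for f in freqs.values())
print(f"average code length: {avg_len:.3f} bits/symbol")
print(f"source entropy:      {entropy:.3f} bits/symbol")
```

Because Huffman coding assigns a whole number of bits to each symbol, its average code length can only equal the entropy when all symbol probabilities are powers of 1/2; arithmetic coding avoids this integer-length constraint, which is why the paper's sequence-probability optimization can push the code rate closer to the entropy.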

Citation (APA)

Zhu, X., Zhang, J., & Zhu, H. (2022). Research of Data Compression Using Huffman Coding and Arithmetic Coding. In Lecture Notes in Electrical Engineering (Vol. 961 LNEE, pp. 954–961). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-981-19-6901-0_98
