When transmitting or storing images and other data, we want the information to occupy as little space as possible without any loss. Entropy coding is a family of lossless compression methods that encode symbols according to their frequency of occurrence; common entropy codes include Shannon coding, Huffman coding, and arithmetic coding. This paper presents a comparative analysis of two widely used methods: Huffman coding and arithmetic coding. By exploiting the correlation between symbols in the coded sequence, the probability of an entire symbol sequence is used in place of the smaller probabilities of individual symbols. Applying this principle to binary arithmetic coding yields a method more effective than traditional Huffman coding: it shortens the average code length and brings the information content of the code closer to the entropy rate of the source, significantly improving the compression ratio of binary arithmetic coding. Experimental tests on different types of data show good compression performance, and the results demonstrate the efficiency of the optimization method described in this paper.
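To make the comparison concrete, the following is a minimal sketch (not the paper's implementation) that builds a Huffman code for a small symbol distribution and compares its average code length with the source entropy, which is the lower bound that arithmetic coding can approach more closely. The symbol alphabet and probabilities are illustrative assumptions.

```python
import heapq
import math

def huffman_code_lengths(probs):
    """Return {symbol: code length in bits} for a Huffman code over probs."""
    # Each heap entry: (subtree probability, tiebreaker, symbols in subtree).
    heap = [(p, i, [s]) for i, (s, p) in enumerate(sorted(probs.items()))]
    heapq.heapify(heap)
    lengths = {s: 0 for s in probs}
    counter = len(heap)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)   # two least-probable subtrees
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1                  # each merge adds one bit
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return lengths

def entropy(probs):
    """Shannon entropy in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs.values() if p > 0)

# Illustrative (dyadic) distribution, where Huffman meets the entropy exactly.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
lengths = huffman_code_lengths(probs)
avg_len = sum(probs[s] * lengths[s] for s in probs)
print(avg_len, entropy(probs))  # both 1.75 bits/symbol here
```

For non-dyadic distributions Huffman's whole-bit codewords leave a gap above the entropy, which is the inefficiency that arithmetic coding, by encoding the probability of the whole sequence, is able to reduce.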
Zhu, X., Zhang, J., & Zhu, H. (2022). Research of Data Compression Using Huffman Coding and Arithmetic Coding. In Lecture Notes in Electrical Engineering (Vol. 961, pp. 954–961). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-981-19-6901-0_98