Distributed vector quantization based on Kullback-Leibler divergence


Abstract

The goal of vector quantization is to represent original vectors/data with a few reproduction vectors while maintaining the necessary fidelity of the data. Distributed signal processing has received much attention in recent years: in many applications, data are collected and stored at nodes dispersed over a network, and centralizing all of these data at one processing center is sometimes impractical. In this paper, we develop a distributed vector quantization (VQ) algorithm based on the Kullback-Leibler (K-L) divergence. We start from the centralized case, propose to minimize the K-L divergence between the distribution of the global original data and the distribution of the global reproduction vectors, and obtain an online iterative solution to this optimization problem based on Robbins-Monro stochastic approximation. We then extend the solution to the distributed case by introducing diffusion cooperation among nodes. Numerical simulations show that the performance of the distributed K-L-based VQ algorithm is very close to that of the corresponding centralized algorithm. Moreover, both the centralized and distributed K-L-based VQ algorithms are more robust to outliers than the (centralized) Linde-Buzo-Gray (LBG) algorithm and the (centralized) self-organizing map (SOM) algorithm.
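The two-stage pipeline the abstract describes — an online, Robbins-Monro-style codebook update at each node, followed by a diffusion combination of neighbors' codebooks — can be sketched as below. This is a minimal illustration under stated assumptions: the soft-assignment rule (a softmax over negative scaled squared distances, controlled by `beta`), the step-size schedule `1/(10+t)`, and the row-stochastic combination matrix `A` are all illustrative choices, not the paper's exact K-L update.

```python
import numpy as np


def online_kl_vq(data, k, n_epochs=5, beta=2.0, seed=0):
    """Illustrative online VQ with a Robbins-Monro decaying step size.

    Reproduction vectors (the codebook) are updated sample by sample.
    The soft-assignment weights are an assumption for illustration,
    not the exact update derived in the paper.
    """
    rng = np.random.default_rng(seed)
    centers = np.array(data[:k], dtype=float)  # simple init: first k samples
    t = 0
    for _ in range(n_epochs):
        for x in data[rng.permutation(len(data))]:
            t += 1
            # Robbins-Monro schedule: sum(eta) diverges, sum(eta^2) converges
            eta = 1.0 / (10.0 + t)
            d2 = np.sum((centers - x) ** 2, axis=1)
            w = np.exp(-beta * (d2 - d2.min()))
            w /= w.sum()                       # soft assignment of x to centers
            centers += eta * w[:, None] * (x - centers)
    return centers


def diffusion_combine(codebooks, A):
    """Diffusion-cooperation sketch: each node replaces its codebook with a
    convex combination of its neighbors' codebooks, weighted by the
    row-stochastic matrix A (a hypothetical combination rule)."""
    stacked = np.stack(codebooks)              # shape (n_nodes, k, dim)
    return np.einsum('ij,jkd->ikd', A, stacked)
```

In a distributed run, each node would alternate local `online_kl_vq`-style updates on its own data with `diffusion_combine` exchanges over the network, so all codebooks gradually agree while only reproduction vectors, not raw data, are communicated.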

Citation (APA)

Shen, P., Li, C., & Luo, Y. (2015). Distributed vector quantization based on Kullback-Leibler divergence. Entropy, 17(12), 7875–7887. https://doi.org/10.3390/e17127851
