Divergence Measure of Belief Function and Its Application in Data Fusion

Abstract

Divergence measures are widely used in many applications. To deal efficiently with uncertainty in real applications, the basic probability assignment (BPA) of Dempster-Shafer evidence theory is adopted instead of a probability distribution. As a result, an open issue is how to measure the divergence between BPAs. In this paper, a new divergence measure between two BPAs is proposed. The proposed divergence measure generalizes the Kullback-Leibler divergence: when a BPA degenerates to a probability distribution, the proposed belief divergence equals the Kullback-Leibler divergence. Furthermore, compared with existing belief divergence measures, the new method performs better in situations with a high degree of uncertainty and ambiguity. Numerical examples illustrate the efficiency of the proposed divergence measure. In addition, based on the proposed belief divergence measure, a combination model is proposed for data fusion. Finally, an example in target recognition shows the advantage of the new belief divergence in handling not only extreme uncertainty but also highly conflicting data.
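
The abstract states the key degeneration property without giving the formula. As a rough, non-authoritative sketch of the idea (the function names and the exact form are assumptions for illustration, not the authors' definition), the following Python snippet applies a KL-style sum over the focal elements of two BPAs; when every focal element is a singleton, the BPA is an ordinary probability distribution and the sum reduces to the Kullback-Leibler divergence, matching the property described above.

```python
import math


def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between two probability
    distributions given as dicts {outcome: probability}.  Assumes every
    outcome with p[x] > 0 also has q[x] > 0."""
    return sum(px * math.log(px / q[x]) for x, px in p.items() if px > 0)


def belief_divergence(m1, m2):
    """Illustrative KL-style divergence between two BPAs, each given as a
    dict mapping focal elements (frozensets of hypotheses) to mass values.
    NOTE: this is a hypothetical sketch, not the exact measure defined in
    the paper; it only demonstrates the property claimed in the abstract:
    when every focal element is a singleton, the BPA is an ordinary
    probability distribution and the sum below equals the
    Kullback-Leibler divergence."""
    return sum(mass * math.log(mass / m2[A]) for A, mass in m1.items() if mass > 0)


# Two BPAs on the frame of discernment {a, b, c}; mass assigned to the
# whole frame {a, b, c} represents total ignorance.
m1 = {frozenset("a"): 0.6, frozenset("b"): 0.3, frozenset("abc"): 0.1}
m2 = {frozenset("a"): 0.5, frozenset("b"): 0.4, frozenset("abc"): 0.1}
print(belief_divergence(m1, m2))  # small positive value: the BPAs are close

# Degenerate case: all focal elements are singletons, so the BPAs are
# probability distributions and belief_divergence equals kl_divergence.
p = {frozenset("a"): 0.7, frozenset("b"): 0.3}
q = {frozenset("a"): 0.5, frozenset("b"): 0.5}
print(belief_divergence(p, q), kl_divergence(p, q))
```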

Cite

APA

Song, Y., & Deng, Y. (2019). Divergence Measure of Belief Function and Its Application in Data Fusion. IEEE Access, 7, 107465–107472. https://doi.org/10.1109/ACCESS.2019.2932390
