Delta Divergence: A Novel Decision Cognizant Measure of Classifier Incongruence


Abstract

In pattern recognition, disagreement between two classifiers regarding the predicted class membership of an observation can be indicative of an anomaly and its nuance. Since, in general, classifiers base their decisions on class a posteriori probabilities, the most natural approach to detecting classifier incongruence is to use a divergence measure. However, existing divergences are not particularly suitable for gauging classifier incongruence. In this paper, we postulate the properties that such a divergence measure should satisfy and propose a novel divergence measure, referred to as delta divergence. In contrast to existing measures, it focuses on the dominant (most probable) hypotheses and thus reduces the effect of the probability mass distributed over the non-dominant hypotheses (clutter). The proposed measure satisfies other important properties, such as symmetry and independence of classifier confidence. The relationship of the proposed divergence to some baseline measures, and its superiority over them, are demonstrated experimentally.
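To make the baseline approach concrete, the sketch below computes a symmetrized Kullback–Leibler divergence between the posterior vectors of two classifiers. This is one of the standard divergences the abstract contrasts with, not the delta divergence itself; the function name and the three-class example probabilities are illustrative assumptions. It shows the weakness the paper targets: two classifiers that agree on the dominant class can still register non-zero divergence purely because of mass spread over the clutter classes.

```python
import numpy as np

def symmetric_kl(p, q, eps=1e-12):
    """Symmetrized KL divergence between two class-posterior vectors.

    A common baseline incongruence measure -- NOT the delta divergence
    proposed in the paper, which instead emphasizes the dominant hypotheses.
    """
    p = np.asarray(p, dtype=float) + eps  # smooth to avoid log(0)
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return 0.5 * (np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

# Both classifiers pick class 0, differing only in the low-probability
# "clutter" classes, yet the divergence is non-zero:
agree = symmetric_kl([0.7, 0.2, 0.1], [0.7, 0.1, 0.2])

# Genuine incongruence: the dominant hypotheses conflict:
conflict = symmetric_kl([0.7, 0.2, 0.1], [0.1, 0.2, 0.7])
```

A decision-cognizant measure should assign (near-)zero incongruence to the first pair and a large value to the second; a clutter-sensitive baseline like symmetric KL does not fully separate the two cases.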

Citation (APA)

Kittler, J., & Zor, C. (2019). Delta Divergence: A Novel Decision Cognizant Measure of Classifier Incongruence. IEEE Transactions on Cybernetics, 49(6), 2331–2343. https://doi.org/10.1109/TCYB.2018.2825353
