3D insights to some divergences for robust statistics and machine learning

Abstract

Divergences (distances) which measure the similarity or proximity between two probability distributions have turned out to be very useful for many different tasks in statistics, machine learning, information theory, etc. Some prominent examples are the Kullback-Leibler information, the Csiszár-Ali-Silvey φ-divergences (CASD) for convex functions φ, the "classical" (i.e., unscaled) Bregman distances, and the more general scaled Bregman distances (SBD) of [26, 27]. By means of 3D plots we show several properties and pitfalls of the geometries of SBDs, also for non-probability distributions; the robustness of the corresponding minimum-distance concepts will also be covered. For these investigations, we construct a special SBD subclass which covers both the frequently used power divergences (of CASD type) and their robustness-enhanced extensions with non-convex, non-concave φ.
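
As a rough orientation (the abstract itself does not spell these out), the scaled Bregman distance of [26, 27] between distributions P and Q, scaled by a measure M with densities p, q, m and generated by a convex function φ, is commonly written as

$$B_{\varphi}(P, Q \mid M) \;=\; \int \Big[\, \varphi\!\Big(\tfrac{p}{m}\Big) \;-\; \varphi\!\Big(\tfrac{q}{m}\Big) \;-\; \varphi'\!\Big(\tfrac{q}{m}\Big)\Big(\tfrac{p}{m} - \tfrac{q}{m}\Big) \,\Big]\, \mathrm{d}M ,$$

and the power divergences of CASD type correspond to the generator family

$$\varphi_{\alpha}(t) \;=\; \frac{t^{\alpha} - \alpha\, t + \alpha - 1}{\alpha(\alpha - 1)}, \qquad \alpha \in \mathbb{R} \setminus \{0, 1\},$$

with the Kullback-Leibler case recovered in the limit α → 1.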

Cite

Roensch, B., & Stummer, W. (2017). 3D insights to some divergences for robust statistics and machine learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10589 LNCS, pp. 460–469). Springer Verlag. https://doi.org/10.1007/978-3-319-68445-1_54
