Measuring the discrepancy between conditional distributions: Methods, properties and applications

Abstract

We propose a simple yet powerful test statistic to quantify the discrepancy between two conditional distributions. The new statistic avoids explicit estimation of the underlying distributions in high-dimensional space and operates on the cone of symmetric positive semidefinite (SPS) matrices using the Bregman matrix divergence. Moreover, it inherits the merits of the correntropy function by explicitly incorporating high-order statistics of the data. We present the properties of our new statistic and illustrate its connections to prior art. Finally, we apply the statistic to three different machine learning problems, namely multi-task learning over graphs, concept drift detection, and information-theoretic feature selection, to demonstrate its utility and advantage. Code for our statistic is available at https://bit.ly/BregmanCorrentropy.
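As a rough illustration of the ingredients named in the abstract, and not the authors' actual statistic (see the linked code for that), the sketch below builds Gaussian-kernel Gram matrices from two sample sets, a common way to encode correntropy-style, higher-order statistics, and compares them with the von Neumann (Bregman) matrix divergence on the SPD cone. The function names, kernel width, and regularization constant are assumptions chosen for illustration only.

```python
# Minimal sketch: compare two sample sets via a Bregman matrix divergence
# between kernel Gram matrices. Not the paper's exact statistic.
import numpy as np
from scipy.linalg import logm

def gram_matrix(X, sigma=1.0):
    """Gaussian-kernel Gram matrix of samples X (shape n x d)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def von_neumann_divergence(A, B, eps=1e-8):
    """Bregman divergence generated by the von Neumann entropy on SPD matrices:
    D(A || B) = tr(A log A - A log B - A + B)."""
    A = A + eps * np.eye(A.shape[0])   # keep matrices strictly positive definite
    B = B + eps * np.eye(B.shape[0])
    return np.trace(A @ (logm(A) - logm(B)) - A + B).real

# Toy usage: two sample sets of (x, y) pairs drawn from slightly different distributions.
rng = np.random.default_rng(0)
S1 = rng.normal(size=(50, 3))
S2 = rng.normal(loc=0.5, size=(50, 3))
G1, G2 = gram_matrix(S1), gram_matrix(S2)
G1 /= np.trace(G1)                     # trace-normalize so the matrices are comparable
G2 /= np.trace(G2)
print(von_neumann_divergence(G1, G2))
```

A larger divergence value indicates a larger discrepancy between the sample sets; the paper's statistic refines this idea to target conditional, rather than joint, distributions.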

Citation (APA)

Yu, S., Shaker, A., Alesiani, F., & Principe, J. (2020). Measuring the discrepancy between conditional distributions: Methods, properties and applications. In IJCAI International Joint Conference on Artificial Intelligence (Vol. 2021-January, pp. 2777–2784). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2020/385
