Generalized Fisher Kernel with Bregman Divergence


Abstract

The Fisher kernel has good statistical properties. From a practical point of view, however, the distributional assumptions it requires complicate its applicability. We approach this problem with NMF (Non-negative Matrix Factorization) methods, which, under adequate normalization conditions, yield stochastic matrices. Using the Bregman divergence as the objective function, formally equivalent solutions appear for the specific forms of the functionals involved. We show that simply plugging these results into the general expression of the NMF kernel, obtained with purely algebraic techniques and without any assumptions about the distribution of the parameters, preserves the properties of the Fisher kernel, making the kernel convenient to use; for the situations in which it is needed, we derive the expression of the Fisher information matrix. In this work, we limit the study to the Gaussian metric, the KL (Kullback-Leibler) divergence, and the I-divergence.
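The pipeline the abstract describes — factorize the data with an NMF objective drawn from the Bregman family, normalize the factors to stochastic matrices, and build a kernel from them — can be sketched as follows. This is a hypothetical illustration, not the paper's exact construction: it uses the classical Lee-Seung multiplicative updates for the KL/I-divergence case and a simple linear kernel on the column-stochastic coefficient matrix; the function names `nmf_kl` and `nmf_kernel` are invented for the example.

```python
import numpy as np

def nmf_kl(V, r, n_iter=200, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates minimizing the KL (I-)divergence
    between V and W @ H -- one member of the Bregman divergence family."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.sum(axis=0)[:, None] + eps)
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (H.sum(axis=1)[:, None].T + eps)
    return W, H

def nmf_kernel(V, r=3):
    """Gram matrix between samples (columns of V) built from the
    column-stochastic NMF coefficients -- an illustrative linear
    kernel on the factorization, not the paper's exact expression."""
    W, H = nmf_kl(V, r)
    H = H / (H.sum(axis=0, keepdims=True) + 1e-12)  # stochastic columns
    return H.T @ H

# Toy usage: 10-dimensional non-negative data, 6 samples.
rng = np.random.default_rng(1)
V = rng.random((10, 6))
K = nmf_kernel(V)
print(K.shape)              # (6, 6)
print(np.allclose(K, K.T))  # True: symmetric, and PSD by construction
```

Because `K = H.T @ H` is a Gram matrix, it is symmetric positive semidefinite and therefore a valid kernel regardless of how well the factorization converged; the distributional machinery of the Fisher kernel enters only through the choice of divergence, as the abstract notes.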

Citation (APA)

Figuera, P., Cuzzocrea, A., & Bringas, P. G. (2022). Generalized Fisher Kernel with Bregman Divergence. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13469 LNAI, pp. 186–194). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-15471-3_17
