Information-Theoretic Methods in Deep Neural Networks: Recent Advances and Emerging Opportunities

Abstract

We present a review of recent advances and emerging opportunities in analyzing deep neural networks (DNNs) with information-theoretic methods. We first discuss popular information-theoretic quantities and their estimators. We then introduce recent developments in information-theoretic learning principles (e.g., loss functions, regularizers and objectives) and their parameterization with DNNs. Finally, we briefly review current uses of information-theoretic concepts in a few modern machine learning problems and list several emerging opportunities.
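As context for the quantities and estimators the survey covers, here is a minimal sketch of the simplest estimator family it builds on: the plug-in (maximum-likelihood) estimate of Shannon entropy and mutual information for discrete samples. The function names and the toy data are illustrative, not from the paper.

```python
from collections import Counter
from math import log2

def entropy(samples):
    """Plug-in estimate of Shannon entropy H(X) in bits:
    replace true probabilities with empirical frequencies."""
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in Counter(samples).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from paired samples."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# Toy example: Y is a deterministic function of X,
# so I(X;Y) equals H(Y) here.
xs = [0, 0, 1, 1, 2, 2]
ys = [x % 2 for x in xs]
print(entropy(xs))                    # H(X) = log2(3) ≈ 1.585 bits
print(mutual_information(xs, ys))     # = H(Y) ≈ 0.918 bits
```

Plug-in estimators like this are known to be biased for small samples and break down for continuous variables; much of the work the survey reviews concerns better-behaved estimators (e.g., variational bounds, kernel-based quantities) suitable for high-dimensional DNN representations.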

Citation (APA)

Yu, S., Sanchez Giraldo, L. G., & Principe, J. C. (2021). Information-Theoretic Methods in Deep Neural Networks: Recent Advances and Emerging Opportunities. In IJCAI International Joint Conference on Artificial Intelligence (pp. 4669–4678). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2021/633
