Information theoretical measures for achieving robust learning machines

Abstract

Information theoretical measures are used to design, from first principles, an objective function that can drive a learning machine to a solution that is robust to perturbations in its parameters. Full analytic derivations are given and tested with computational examples, showing that the procedure is indeed successful. The final solution, implemented by a robust learning machine, expresses a balance between Shannon differential entropy and Fisher information. That this balance emerges as an analytical relation is itself surprising, given the purely numerical operations of the learning machine.
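The two quantities the abstract balances pull in opposite directions, which is easiest to see in the Gaussian case: for N(μ, σ²), the differential entropy is ½·ln(2πeσ²) while the Fisher information about the location μ is 1/σ², so widening the distribution raises entropy and lowers Fisher information. The sketch below illustrates only this textbook trade-off, not the paper's actual objective function; the function names are ours.

```python
import math

def gaussian_entropy(sigma):
    """Shannon differential entropy of N(mu, sigma^2), in nats."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def gaussian_fisher(sigma):
    """Fisher information of N(mu, sigma^2) about the location mu."""
    return 1.0 / sigma ** 2

# As sigma grows, entropy rises while Fisher information falls:
for sigma in (0.5, 1.0, 2.0):
    print(f"sigma={sigma}: h={gaussian_entropy(sigma):.3f}, "
          f"I={gaussian_fisher(sigma):.3f}")
```

Any objective that rewards entropy while also rewarding Fisher information must therefore settle at a compromise width, which is the kind of balance the paper derives analytically.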


Citation (APA)

Zegers, P., Frieden, B. R., Alarcón, C., & Fuentes, A. (2016). Information theoretical measures for achieving robust learning machines. Entropy, 18(8). https://doi.org/10.3390/e18080295

