Information-theoretic measures are used to design, from first principles, an objective function that drives a learning machine toward a solution that is robust to perturbations in its parameters. Full analytic derivations are given and tested with computational examples, showing that the procedure is indeed successful. The final solution, implemented by a robust learning machine, expresses a balance between Shannon differential entropy and Fisher information. It is also surprising that this balance takes the form of an analytical relation, given that the learning machine performs purely numerical operations.
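The interplay between the two measures can be made concrete with the Gaussian distribution, for which both have closed forms. The sketch below is a standard textbook illustration, not the paper's derivation: for N(μ, σ²), the differential entropy is h = ½ ln(2πeσ²) and the Fisher information about the location parameter is I = 1/σ², so exp(2h)·I is the same constant for every Gaussian, showing how the two quantities trade off.

```python
import math

# Closed-form expressions for a Gaussian N(mu, sigma^2)
# (textbook results, used here only to illustrate the trade-off):
#   Shannon differential entropy: h = 0.5 * ln(2*pi*e*sigma^2)
#   Fisher information about mu:  I = 1 / sigma^2

def gaussian_entropy(sigma):
    """Differential entropy of N(mu, sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def gaussian_fisher_info(sigma):
    """Fisher information about the location parameter mu."""
    return 1.0 / sigma ** 2

for sigma in (0.5, 1.0, 2.0):
    h = gaussian_entropy(sigma)
    fisher = gaussian_fisher_info(sigma)
    # exp(2h) * I equals 2*pi*e for every sigma: as the distribution
    # spreads out (entropy rises), the Fisher information falls.
    print(f"sigma={sigma}: h={h:.4f}, I={fisher:.4f}, "
          f"exp(2h)*I={math.exp(2 * h) * fisher:.4f}")
```

Note that entropy grows with σ while Fisher information shrinks, which is the qualitative tension the paper's objective function balances.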
Zegers, P., Frieden, B. R., Alarcón, C., & Fuentes, A. (2016). Information theoretical measures for achieving robust learning machines. Entropy, 18(8). https://doi.org/10.3390/e18080295