Information Theoretical Measures for Achieving Robust Learning Machines
Affiliation: Univ Arizona, Coll Opt Sci
Keywords: information theoretical learning
Citation: Information Theoretical Measures for Achieving Robust Learning Machines. Entropy 2016, 18 (8): 295.
Rights: This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited (CC BY 4.0).
Collection Information: © 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).
Abstract: Information-theoretical measures are used to design, from first principles, an objective function that can drive a learning-machine process to a solution that is robust to perturbations in its parameters. Full analytic derivations are given and verified with computational examples showing that the procedure is successful. The final solution, implemented by a robust learning machine, expresses a balance between Shannon differential entropy and Fisher information. That this balance takes the form of an analytical relation is surprising, given the purely numerical operations of the learning machine.
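The abstract's central claim is a balance between Shannon differential entropy and Fisher information. A minimal illustrative sketch of that trade-off, using the standard closed-form expressions for a Gaussian location family (entropy H = ½ ln(2πeσ²), Fisher information I_F = 1/σ²); the combined objective J and its weight `lam` are hypothetical choices for illustration, not the paper's actual objective function:

```python
import math

def gaussian_entropy(sigma):
    """Shannon differential entropy of N(mu, sigma^2): 0.5 * ln(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

def gaussian_fisher(sigma):
    """Fisher information of N(mu, sigma^2) with respect to the location mu: 1/sigma^2."""
    return 1.0 / sigma ** 2

# Hypothetical combined objective J = H - lam * I_F: as sigma grows,
# entropy rises (a flatter, more spread-out solution) while Fisher
# information falls (lower sensitivity to the parameter), so a fixed
# weight lam trades one against the other.
lam = 0.5
for sigma in (0.5, 1.0, 2.0):
    H = gaussian_entropy(sigma)
    I = gaussian_fisher(sigma)
    print(f"sigma={sigma}: H={H:.4f}  I_F={I:.4f}  J={H - lam * I:.4f}")
```

For the Gaussian these two quantities move in opposite directions as σ varies, which is the qualitative tension the abstract's "balance" refers to.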
Version: Final published version
Sponsors: CONICYT Chile [FONDECYT 1120680]