Information Theoretical Measures for Achieving Robust Learning Machines
Affiliation
Univ Arizona, Coll Opt Sci
Issue Date
2016-08-12
Keywords
information theoretical learning
Shannon entropy
Kullback-Leibler divergence
relative entropy
cross-entropy
Fisher information
relative information
Publisher
MDPI AG
Citation
Information Theoretical Measures for Achieving Robust Learning Machines. Entropy 2016, 18 (8): 295.
Journal
Entropy
Rights
Copyright © 2016 by the authors; licensee MDPI, Basel, Switzerland. This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0).
Collection Information
© 2016 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC-BY) license (http://creativecommons.org/licenses/by/4.0/).
Abstract
Information theoretical measures are used to design, from first principles, an objective function that can drive a learning machine toward a solution that is robust to perturbations in its parameters. Full analytic derivations are given and tested with computational examples, which show that the procedure is indeed successful. The final solution, implemented by a robust learning machine, expresses a balance between Shannon differential entropy and Fisher information. It is also surprising that this balance takes the form of an analytical relation, given the purely numerical operations of the learning machine.
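As a concrete illustration of how the two measures named in the abstract can be tied together analytically, the minimal sketch below (plain NumPy, written for this record rather than taken from the paper; the Gaussian density and the scale value are assumptions) estimates the Shannon differential entropy and the Fisher information of a Gaussian by quadrature and checks the known closed-form identity H = ½ ln(2πe) − ½ ln I. The paper's actual objective function is not reproduced here.

```python
# Illustrative sketch only (not code from the paper): for a Gaussian density,
# the Shannon differential entropy H and the Fisher information I of the
# density satisfy the closed-form relation H = 0.5*ln(2*pi*e) - 0.5*ln(I).
# The quadrature below verifies that analytical link numerically.
import numpy as np

def gaussian_pdf(x, sigma):
    """Zero-mean Gaussian density with standard deviation sigma."""
    return np.exp(-0.5 * (x / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def differential_entropy(p, x):
    """H = -integral p(x) ln p(x) dx, by trapezoidal quadrature."""
    return -np.trapz(p * np.log(p), x)

def fisher_information(p, x):
    """I = integral (dp/dx)^2 / p(x) dx, by trapezoidal quadrature."""
    dp = np.gradient(p, x)
    return np.trapz(dp ** 2 / p, x)

sigma = 1.7                                  # assumed scale for the example
x = np.linspace(-12 * sigma, 12 * sigma, 200_001)
p = gaussian_pdf(x, sigma)

H = differential_entropy(p, x)
I = fisher_information(p, x)

print(f"H numeric = {H:.6f}   closed form = {0.5 * np.log(2 * np.pi * np.e * sigma**2):.6f}")
print(f"I numeric = {I:.6f}   closed form = {1.0 / sigma**2:.6f}")
# The analytical relation between the two measures (Gaussian case):
print(f"0.5*ln(2*pi*e) - 0.5*ln(I) = {0.5 * np.log(2 * np.pi * np.e) - 0.5 * np.log(I):.6f}  (equals H)")
```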
ISSN
1099-4300
Version
Final published version
Sponsors
CONICYT Chile [FONDECYT 1120680]
Additional Links
http://www.mdpi.com/1099-4300/18/8/295
DOI
10.3390/e18080295