Show simple item record

dc.contributor.author: Richter, Edward
dc.contributor.author: Valancius, Spencer
dc.contributor.author: McClanahan, Josiah
dc.contributor.author: Mixter, John
dc.contributor.author: Akoglu, Ali
dc.date.accessioned: 2018-08-14T18:12:18Z
dc.date.available: 2018-08-14T18:12:18Z
dc.date.issued: 2018-07
dc.identifier.citation: Richter, E., Valancius, S., McClanahan, J. et al. J Supercomput (2018) 74: 3211. https://doi.org/10.1007/s11227-018-2374-x
dc.identifier.issn: 0920-8542
dc.identifier.issn: 1573-0484
dc.identifier.doi: 10.1007/s11227-018-2374-x
dc.identifier.uri: http://hdl.handle.net/10150/628514
dc.description.abstract: Artificial neural networks (ANNs) have become a popular means of solving complex problems in prediction-based applications such as image and natural language processing. Two challenges prominent in the neural network domain are the practicality of hardware implementation and dynamically training the network. In this study, we address these challenges with a development methodology that balances the hardware footprint and the quality of the ANN. We use the well-known perceptron-based branch prediction problem as a case study for demonstrating this methodology. This problem is well suited for analyzing dynamic hardware implementations of ANNs because it exists in hardware and trains dynamically. Using our hierarchical configuration search space exploration, we show that we can decrease the memory footprint of a standard perceptron-based branch predictor by a factor of 2.3 with only a 0.6% decrease in prediction accuracy.
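
For context, the following is a minimal sketch in C of the baseline perceptron branch predictor the abstract refers to, in the style of Jiménez and Lin's original design. It is not the authors' implementation: the history length, table size, modulo PC hash, and 8-bit weight width below are illustrative assumptions, not values from the paper.

#include <stdint.h>
#include <stdlib.h>

#define HIST_LEN 12                              /* global history length (assumed) */
#define NUM_PERC 256                             /* perceptron table size (assumed) */
#define THETA    ((int)(1.93 * HIST_LEN + 14))   /* training threshold from Jimenez & Lin */

static int8_t weights[NUM_PERC][HIST_LEN + 1];   /* index 0 holds the bias weight */
static int    history[HIST_LEN];                 /* +1 = taken, -1 = not taken */

/* Saturate weight updates to the signed 8-bit range (width is an assumption). */
static int8_t clamp8(int v) {
    return (int8_t)(v > 127 ? 127 : (v < -128 ? -128 : v));
}

/* Dot product of the selected perceptron's weights with the history register. */
static int perceptron_output(uint32_t pc) {
    int8_t *w = weights[pc % NUM_PERC];          /* simple modulo hash (assumed) */
    int y = w[0];
    for (int i = 0; i < HIST_LEN; i++)
        y += w[i + 1] * history[i];
    return y;
}

/* Predict taken when the perceptron output is non-negative. */
int predict(uint32_t pc) {
    return perceptron_output(pc) >= 0;
}

/* Train on a misprediction or a low-confidence output, then shift the
 * actual outcome into the global history register. */
void train(uint32_t pc, int taken) {
    int y = perceptron_output(pc);
    int t = taken ? 1 : -1;
    if ((y >= 0) != taken || abs(y) <= THETA) {
        int8_t *w = weights[pc % NUM_PERC];
        w[0] = clamp8(w[0] + t);
        for (int i = 0; i < HIST_LEN; i++)
            w[i + 1] = clamp8(w[i + 1] + t * history[i]);
    }
    for (int i = HIST_LEN - 1; i > 0; i--)
        history[i] = history[i - 1];
    history[0] = t;
}

The memory footprint of such a predictor scales with table size, history length, and weight width; these are the kinds of parameters the paper's hierarchical configuration search varies to trade footprint against prediction accuracy.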
dc.description.sponsorship: Raytheon Missile Systems [2017-UNI-0008]
dc.language.iso: en
dc.publisher: SPRINGER
dc.relation.url: http://link.springer.com/10.1007/s11227-018-2374-x
dc.rights: © Springer Science+Business Media, LLC, part of Springer Nature 2018.
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Artificial neural network
dc.subject: Branch prediction
dc.subject: Perceptron
dc.subject: SimpleScalar
dc.title: Balancing the learning ability and memory demand of a perceptron-based dynamically trainable neural network
dc.type: Article
dc.contributor.department: Univ Arizona, Dept Elect & Comp Engn
dc.identifier.journal: JOURNAL OF SUPERCOMPUTING
dc.description.note: 12 month embargo; published online: 16 April 2018
dc.description.collectioninformation: This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at repository@u.library.arizona.edu.
dc.eprint.version: Final accepted manuscript
dc.source.journaltitle: The Journal of Supercomputing
dc.source.volume: 74
dc.source.issue: 7
dc.source.beginpage: 3211
dc.source.endpage: 3235


Files in this item

Name: balancing-learning-ability.pdf
Size: 770.4 KB
Format: PDF
Description: Final Accepted Manuscript
