

dc.contributor.advisor: Andrews, Gregory R. (en_US)
dc.contributor.advisor: Hartman, John H. (en_US)
dc.contributor.author: Perianayagam, Somasundaram
dc.creator: Perianayagam, Somasundaram (en_US)
dc.date.accessioned: 2011-10-14T16:21:38Z
dc.date.available: 2011-10-14T16:21:38Z
dc.date.issued: 2011
dc.identifier.uri: http://hdl.handle.net/10150/145386
dc.description.abstract: Computational science software experiments are hard to reproduce because external data sets could have changed, software used in the original experiment cannot be reconstructed, or the input parameters for an experiment may not be documented. We have developed a set of tools called Rex to aid in reproducing software experiments. They enable one to record an experiment and archive its apparatus, replay experiments, run new experiments on a recorded apparatus, and compare two recorded experiments. Rex can handle sequential, multiprocess, and multithreaded programs. It does not require any modification to applications or the operating system on which they execute. The implementation of the Rex tools is based on being able to trap and compare the system calls made by an experiment. This dissertation discusses the challenges in reproducing software experiments and describes Rex's design and implementation. It also evaluates the execution and space overhead of the Rex tools. (en_US)
dc.language.iso: en (en_US)
dc.publisher: The University of Arizona. (en_US)
dc.rights: Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author. (en_US)
dc.title: Reproducing Software Experiments (en_US)
dc.type: Electronic Dissertation (en_US)
dc.type: text (en_US)
dc.identifier.oclc: 752261301
thesis.degree.grantor: University of Arizona (en_US)
thesis.degree.level: doctoral (en_US)
dc.contributor.committeemember: Debray, Saumya K. (en_US)
dc.contributor.committeemember: Gniady, Christopher (en_US)
dc.identifier.proquest: 11428
thesis.degree.discipline: Graduate College (en_US)
thesis.degree.discipline: Computer Science (en_US)
thesis.degree.name: Ph.D. (en_US)
refterms.dateFOA: 2018-06-23T23:37:44Z
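The abstract describes Rex's core mechanism: replay works by trapping the system calls an experiment makes, recording their results, and later comparing or substituting them. As a rough illustration of that record/replay idea, here is a minimal Python sketch of interposition at the function level rather than the system-call level; the `RecordReplay` class and its API are invented for this example and are not part of Rex.

```python
import time


class RecordReplay:
    """Toy record/replay interposer (illustrative only, not Rex).

    In "record" mode, results of nondeterministic calls are executed
    and logged. In "replay" mode, the logged results are returned
    instead of re-executing, so a re-run reproduces the original
    outcomes deterministically.
    """

    def __init__(self, mode):
        self.mode = mode  # "record" or "replay"
        self.log = []     # recorded call results, in call order
        self.pos = 0      # next log entry to replay

    def call(self, fn, *args):
        if self.mode == "record":
            result = fn(*args)      # actually perform the call
            self.log.append(result)
            return result
        result = self.log[self.pos]  # replay: serve the recorded result
        self.pos += 1
        return result


# Record an "experiment" whose output depends on the current time.
recorder = RecordReplay("record")
first = recorder.call(time.time)

# Replay against the saved log: the same call site now returns the
# recorded value instead of a fresh timestamp.
replayer = RecordReplay("replay")
replayer.log = recorder.log
assert replayer.call(time.time) == first
```

Rex applies this idea one layer down, at the system-call boundary, which is what lets it handle unmodified applications: the program's own code never changes, only what its calls return.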


Files in this item

Name: azu_etd_11428_sip1_m.pdf
Size: 3.495 MB
Format: PDF

