
dc.contributor.advisor: Yakowitz, Sidney
dc.contributor.author: RUTHERFORD, BRIAN MILNE.
dc.creator: RUTHERFORD, BRIAN MILNE.
dc.date.accessioned: 2011-10-31T16:53:50Z
dc.date.available: 2011-10-31T16:53:50Z
dc.date.issued: 1986
dc.identifier.uri: http://hdl.handle.net/10150/183923
dc.description.abstract: The problem considered relates to estimating an arbitrary regression function m(x) from sample pairs (Xᵢ, Yᵢ), 1 ≤ i ≤ n. A model is assumed of the form Y = m(x) + ε(x), where ε(x) is a random variable with expectation 0. One well-known method for estimating m(x) is to use one of a class of kernel regression estimators, say mₙ(x). Schuster (1972) has shown conditions under which the limiting distribution of the kernel estimator mₙ(x) is the normal distribution. It might also be of interest to use the data to estimate the distribution of mₙ(x). One could, given this estimate, construct approximate confidence bounds for the function m(x). Three estimators are proposed for the density of mₙ(x). They share a basis in non-parametric kernel regression and utilize bootstrap techniques to obtain the density estimate. The order of convergence of one of the estimators is examined, and conditions are given under which the order is higher than when estimation is by the normal approximation. Finally, the performance of each estimator for constructing confidence bounds is compared for moderate sample sizes using computer studies.
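The dissertation text is not reproduced in this record. As a rough illustration of the kind of procedure the abstract describes, the sketch below combines a Nadaraya-Watson kernel regression estimate with a residual bootstrap to form pointwise confidence bounds for m(x). The Gaussian kernel, the fixed bandwidth h, the resampling scheme, and all function names are assumptions made for illustration, not the estimators studied in the dissertation.

    import numpy as np

    def nw_kernel_estimate(x_grid, X, Y, h):
        # Nadaraya-Watson estimate of m(x) on x_grid with a Gaussian
        # kernel and bandwidth h (illustrative choice, not from the thesis).
        w = np.exp(-0.5 * ((x_grid[:, None] - X[None, :]) / h) ** 2)
        return (w @ Y) / w.sum(axis=1)

    def bootstrap_confidence_bounds(X, Y, x_grid, h, n_boot=500, alpha=0.05, seed=None):
        # Pointwise (1 - alpha) bounds: resample residuals from the initial
        # kernel fit, rebuild Y*, refit, and take percentile bounds.
        rng = np.random.default_rng(seed)
        m_hat_X = nw_kernel_estimate(X, X, Y, h)      # fit at the design points
        residuals = Y - m_hat_X
        m_hat_grid = nw_kernel_estimate(x_grid, X, Y, h)

        boot_curves = np.empty((n_boot, x_grid.size))
        for b in range(n_boot):
            Y_star = m_hat_X + rng.choice(residuals, size=residuals.size, replace=True)
            boot_curves[b] = nw_kernel_estimate(x_grid, X, Y_star, h)

        lower = np.quantile(boot_curves, alpha / 2, axis=0)
        upper = np.quantile(boot_curves, 1 - alpha / 2, axis=0)
        return m_hat_grid, lower, upper

    # Example on synthetic data with a known regression function.
    rng = np.random.default_rng(0)
    X = np.sort(rng.uniform(0, 1, 200))
    Y = np.sin(2 * np.pi * X) + rng.normal(0, 0.3, X.size)
    x_grid = np.linspace(0.05, 0.95, 50)
    m_hat, lo, hi = bootstrap_confidence_bounds(X, Y, x_grid, h=0.05, seed=1)

The percentile bounds here approximate the distribution of mₙ(x) directly from the data, which is the role the abstract assigns to the bootstrap-based density estimators as an alternative to the normal approximation.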
dc.language.iso: en
dc.publisher: The University of Arizona.
dc.rights: Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
dc.subject: Regression analysis.
dc.subject: Nonparametric statistics.
dc.title: BOOTSTRAP AND RELATED METHODS FOR APPROXIMATE CONFIDENCE BOUNDS IN NONPARAMETRIC REGRESSION.
dc.type: text
dc.type: Dissertation-Reproduction (electronic)
dc.identifier.oclc: 697840213
thesis.degree.grantor: University of Arizona
thesis.degree.level: doctoral
dc.identifier.proquest: 8702351
thesis.degree.discipline: Systems and Industrial Engineering
thesis.degree.discipline: Graduate College
thesis.degree.name: Ph.D.
refterms.dateFOA: 2018-06-15T19:19:28Z


Files in this item

Name: azu_td_8702351_sip1_m.pdf
Size: 2.509Mb
Format: PDF
Description: azu_td_8702351_sip1_m.pdf

