Show simple item record

dc.contributor.advisor	Furfaro, Roberto	en
dc.contributor.author	Law, Andrew M.
dc.creator	Law, Andrew M.	en
dc.date.accessioned	2016-01-15T19:32:55Z	en
dc.date.available	2016-01-15T19:32:55Z	en
dc.date.issued	2015	en
dc.identifier.uri	http://hdl.handle.net/10150/593622	en
dc.description.abstract	To perform close-proximity operations in a low-gravity environment, knowledge of relative and absolute position is vital to the maneuver; navigation is therefore inseparable from space travel. Extreme Learning Machine (ELM) is presented as an optical navigation method around small celestial bodies. Optical navigation uses visual observation instruments, such as a camera, to acquire data and determine spacecraft position. The required input for operation is merely a single image strip and a nadir image. ELM is a single-hidden-layer feedforward network (SLFN), a type of neural network (NN). The algorithm is built on the premise that the input weights and biases can be randomly assigned, so back-propagation is not required. The learned model consists of the output-layer weights, which are used to compute predictions. Extreme Learning Machine Optical Navigation (ELM OpNav) thus combines optical images with the ELM algorithm to train the machine to navigate around a target body. In this thesis, the asteroid Vesta is the designated celestial body. The trained ELMs estimate the position of the spacecraft during operation from a single data set. The results show the approach is promising and potentially suitable for on-board navigation.	en
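The abstract describes the ELM training scheme: input weights and biases are drawn at random, and the only learned parameters are the output-layer weights, obtained in closed form rather than by back-propagation. A minimal sketch of that scheme follows, using random synthetic data in place of the thesis's image features; all variable names, dimensions, and the sigmoid activation are illustrative assumptions, not the author's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data standing in for image features -> spacecraft position:
# 200 samples, 10 input features, 3 target coordinates.
X = rng.normal(size=(200, 10))
T = X @ rng.normal(size=(10, 3))

# 1) Randomly assign input weights and biases; these are never trained.
n_hidden = 50
W = rng.normal(size=(10, n_hidden))
b = rng.normal(size=(1, n_hidden))

# 2) Hidden-layer output through a sigmoid activation.
H = 1.0 / (1.0 + np.exp(-(X @ W + b)))

# 3) Solve for the output-layer weights in closed form via the
#    Moore-Penrose pseudoinverse -- no back-propagation involved.
beta = np.linalg.pinv(H) @ T

# Prediction on a new sample is a single forward pass.
x_new = rng.normal(size=(1, 10))
h_new = 1.0 / (1.0 + np.exp(-(x_new @ W + b)))
position_estimate = h_new @ beta
```

The closed-form least-squares solve is what makes ELM training fast compared with iterative gradient descent, which is one reason the abstract flags it as potentially suitable for on-board use.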
dc.language.iso	en_US	en
dc.publisher	The University of Arizona.	en
dc.rights	Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.	en
dc.subject	Extreme Learning Machine	en
dc.subject	Navigation	en
dc.subject	Neural Network	en
dc.subject	Relative Optical Navigation	en
dc.subject	Small Bodies	en
dc.subject	Aerospace Engineering	en
dc.subject	Artificial Intelligence	en
dc.title	Relative Optical Navigation around Small Bodies via Extreme Learning Machines	en_US
dc.type	text	en
dc.type	Electronic Thesis	en
thesis.degree.grantor	University of Arizona	en
thesis.degree.level	masters	en
dc.contributor.committeemember	Furfaro, Roberto	en
dc.contributor.committeemember	Butcher, Eric	en
dc.contributor.committeemember	Gaylor, David	en
thesis.degree.discipline	Graduate College	en
thesis.degree.discipline	Aerospace Engineering	en
thesis.degree.name	M.S.	en
refterms.dateFOA	2018-09-11T03:08:22Z


Files in this item

Name: azu_etd_14249_sip1_m.pdf
Size: 40.94 MB
Format: PDF

