A Deep Learning Approach to Autonomous Relative Terrain Navigation
Author
Campbell, Tanner
Issue Date
2017
Keywords
Artificial Intelligence
Autonomous Navigation
Convolutional Neural Network
Deep Neural Network
Relative Terrain Navigation
Spacecraft GNC
Advisor
Furfaro, Roberto
Metadata
Publisher
The University of Arizona.
Rights
Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction, or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
Abstract
Autonomous relative terrain navigation is a problem at the forefront of many space missions involving close-proximity operations around a target body. No definitive solution exists: many techniques using both passive and active sensors help address the problem, but nearly all require high-fidelity models of the environment's dynamics. Convolutional Neural Networks (CNNs) trained on images rendered from a digital terrain map (DTM) of the body's surface offer a way to sidestep unknown or complex dynamics while still providing reliable autonomous navigation: the network maps an image directly to a position relative to the target body. Because trained CNNs are portable, training can be performed "offline," yielding a matured network that can be loaded onto a spacecraft for real-time position acquisition. In this thesis the lunar surface is used as the proving ground for this optical navigation technique, but the methods are not unique to the Moon and are applicable in general.
Type
text
Electronic Thesis
Degree Name
M.S.
Degree Level
masters
Degree Program
Graduate College
Aerospace Engineering
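The abstract's core idea, a network that regresses a relative position directly from a terrain image, can be caricatured as a forward pass: convolve the image with learned filters, pool the activations into a feature vector, and apply a linear head that outputs a 3-vector. The minimal NumPy sketch below illustrates only that data flow; all shapes, names, and the random (untrained) weights are illustrative assumptions, not the architecture used in the thesis.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2-D cross-correlation of a single-channel image."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(h - kh + 1):
        for j in range(w - kw + 1):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def predict_position(img, kernels, W, b):
    """Map an image to a hypothetical relative-position 3-vector.

    kernels: (n_filters, kh, kw) convolution filters
    W, b:    (3, n_filters) weights and (3,) bias of a linear head
    """
    # Conv + ReLU + global average pooling -> one scalar feature per filter.
    feats = np.array([np.maximum(conv2d(img, k), 0.0).mean() for k in kernels])
    return W @ feats + b

# Tiny demo with random weights (an untrained stand-in for a trained CNN).
rng = np.random.default_rng(0)
img = rng.random((32, 32))               # stand-in for a rendered DTM image
kernels = rng.standard_normal((4, 5, 5))
W = rng.standard_normal((3, 4))
b = np.zeros(3)
pos = predict_position(img, kernels, W, b)
print(pos.shape)  # (3,)
```

In the thesis's setting the weights would instead be learned offline from many DTM-rendered images with known camera positions, and the frozen network shipped to the spacecraft for inference.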