An Adaptive Hierarchical Approach to Lidar-based Autonomous Robotic Navigation
dc.contributor.author | Brooks, Alexander J. -W. | |
dc.contributor.author | Fink, Wolfgang | |
dc.contributor.author | Tarbell, Mark A. | |
dc.date.accessioned | 2018-08-09T19:06:12Z | |
dc.date.available | 2018-08-09T19:06:12Z | |
dc.date.issued | 2018 | |
dc.identifier.citation | Alexander J.-W. Brooks, Wolfgang Fink, Mark A. Tarbell, "An adaptive hierarchical approach to lidar-based autonomous robotic navigation", Proc. SPIE 10639, Micro- and Nanotechnology Sensors, Systems, and Applications X, 106391X (8 May 2018); doi: 10.1117/12.2303770; https://doi.org/10.1117/12.2303770 | en_US |
dc.identifier.issn | 0277-786X | |
dc.identifier.issn | 1996-756X | |
dc.identifier.doi | 10.1117/12.2303770 | |
dc.identifier.uri | http://hdl.handle.net/10150/628383 | |
dc.description.abstract | Planetary missions are typically confined to navigationally safe environments, leaving areas of interest in rugged and/or hazardous terrain largely unexplored. Identifying and avoiding possible hazards requires dedicated path planning and limits the effectiveness of (semi-)autonomous systems. An adaptable, fully autonomous design is ideal for investigating more dangerous routes, increasing robotic exploratory capabilities, and improving overall mission efficiency from a science return perspective. We introduce hierarchical Lidar-based behavior motifs encompassing actions such as velocity control, obstacle avoidance, deepest path navigation/exploration, and ratio constraint, which can be combined and prioritized to form more complex behaviors, such as free roaming and object tracking, as a robust framework for designing autonomous exploratory systems. Moreover, we introduce a dynamic Lidar environment visualization tool. Developing foundational behaviors as fundamental motifs (1) clarifies response priority in complex situations, and (2) streamlines the creation of new behavioral models by building a highly generalizable core for basic navigational autonomy. Implementation details for creating new prototypes of complex behavior patterns on top of behavior motifs are shown as a proof of concept for earthly applications. This paper emphasizes the need for autonomous navigation capabilities in the context of space exploration as well as the exploration of other extreme or hazardous environments, and demonstrates the benefits of constructing more complex behaviors from reusable standalone motifs. It also discusses the integration of behavioral motifs into multi-tiered mission architectures, such as Tier-Scalable Reconnaissance. | en_US |
dc.description.sponsorship | Edward & Maria Keonjian Endowment at the University of Arizona; NASA via Arizona Space Grant Consortium (AZSGC) [NNX15AJ17H] | en_US |
dc.language.iso | en | en_US |
dc.publisher | SPIE-INT SOC OPTICAL ENGINEERING | en_US |
dc.relation.url | https://www.spiedigitallibrary.org/conference-proceedings-of-spie/10639/106391X/An-adaptive-hierarchical-approach-to-lidar-based-autonomous-robotic-navigation/10.1117/12.2303770.full?SSO=1 | en_US |
dc.rights | © 2018 SPIE. | en_US |
dc.rights.uri | http://rightsstatements.org/vocab/InC/1.0/ | |
dc.subject | Autonomous C4ISR systems | en_US |
dc.subject | multi-tiered robotic exploration architectures | en_US |
dc.subject | navigational behavior motifs | en_US |
dc.subject | 2D Lidar data | en_US |
dc.subject | velocity control | en_US |
dc.subject | obstacle avoidance | en_US |
dc.subject | deepest path navigation | en_US |
dc.subject | ratio constraint | en_US |
dc.title | An Adaptive Hierarchical Approach to Lidar-based Autonomous Robotic Navigation | en_US |
dc.type | Article | en_US |
dc.contributor.department | Univ Arizona, Coll Engn, Visual & Autonomous Explorat Syst Res Lab | en_US |
dc.identifier.journal | MICRO- AND NANOTECHNOLOGY SENSORS, SYSTEMS, AND APPLICATIONS X | en_US |
dc.description.collectioninformation | This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at repository@u.library.arizona.edu. | en_US |
dc.eprint.version | Final published version | en_US |
refterms.dateFOA | 2018-08-09T19:06:12Z |