Determining Subarctic Peatland Vegetation Using an Unmanned Aerial System (UAS)
Affiliation: Univ Arizona, Dept Ecol & Evolutionary Biol
Keywords: unmanned aerial system (UAS); artificial neural network
Citation: Palace, M., Herrick, C., DelGreco, J., Finnell, D., Garnello, A. J., McCalley, C., ... & Varner, R. K. (2018). Determining subarctic peatland vegetation using an unmanned aerial system (UAS). Remote Sensing, 10(9), 1498.
Rights: Copyright © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Collection Information: This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at email@example.com.
Abstract: Rising global temperatures tied to increases in greenhouse gas emissions are impacting high latitude regions, leading to changes in vegetation composition and feedbacks to climate through increased methane (CH4) emissions. In subarctic peatlands, permafrost collapse has led to shifts in vegetation species on landscape scales with high spatial heterogeneity. Our goal was to provide a baseline for vegetation distribution related to permafrost collapse and changes in biogeochemical processes. We collected unmanned aerial system (UAS) imagery at Stordalen Mire, Abisko, Sweden to classify vegetation cover types. A series of digital image processing routines were used to generate texture attributes within the image for the purpose of characterizing vegetative cover types. An artificial neural network (ANN) was developed to classify the image. The ANN used all texture variables and color bands (three spectral bands and six metrics) to generate a probability map for each of the eight cover classes. We used the highest probability for a class at each pixel to designate the cover type in the final map. Our overall misclassification rate was 32%, while omission and commission error by class ranged from 0% to 50%. We found that within our area of interest, cover classes most indicative of underlying permafrost (hummock and tall shrub) comprised 43.9% of the landscape. Our effort showed the capability of an ANN applied to UAS high-resolution imagery to develop a classification that focuses on vegetation types associated with permafrost status and therefore potentially changes in greenhouse gas exchange. We also examined the multiple class probabilities predicted at each pixel to assess model confusion. UAS image collection can be an inexpensive and repeatable avenue to determine vegetation change at high latitudes, which can further be used to estimate and scale corresponding changes in CH4 emissions.
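The per-pixel decision rule described in the abstract — assign each pixel the cover class with the highest ANN-predicted probability, and inspect the full set of class probabilities to gauge model confusion — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the random probability array stands in for the ANN's eight per-class probability maps, and the "margin" confusion measure is an assumed example of how multiple probabilities per pixel might be examined.

```python
import numpy as np

# Hypothetical stand-in for the ANN output: an (H, W, n_classes) array
# holding one probability map per cover class (8 classes in the paper).
rng = np.random.default_rng(0)
H, W, n_classes = 4, 5, 8
probs = rng.random((H, W, n_classes))
probs /= probs.sum(axis=-1, keepdims=True)  # normalize so each pixel sums to 1

# Final cover map: the class with the highest probability at each pixel.
cover_map = np.argmax(probs, axis=-1)

# One possible per-pixel "confusion" measure: the margin between the top
# two class probabilities (a small margin means the model was nearly
# undecided between two cover types at that pixel).
sorted_probs = np.sort(probs, axis=-1)
margin = sorted_probs[..., -1] - sorted_probs[..., -2]

print(cover_map.shape)  # one class label per pixel
print(margin.shape)     # one confusion score per pixel
```

In practice the probability stack would come from the trained ANN applied to the spectral bands and texture metrics, and low-margin pixels could be flagged or mapped to visualize where class confusion concentrates.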
Note: Open access journal
Version: Final published version
Sponsors: National Science Foundation (NSF); National Aeronautics and Space Administration (NASA) [NNX14AD31G, NNX17AK10G]; NSF; University of New Hampshire's Hamel Center for Summer Undergraduate Research Abroad (SURF Abroad)