Determining Subarctic Peatland Vegetation Using an Unmanned Aerial System (UAS)
Name:
remotesensing-10-01498.pdf
Size:
12.39 MB
Format:
PDF
Description:
Final Published Version
Author
Palace, Michael
Herrick, Christina
DelGreco, Jessica
Finnell, Daniel
Garnello, Anthony
McCalley, Carmody
McArthur, Kellen
Sullivan, Franklin
Varner, Ruth

Affiliation
Univ Arizona, Dept Ecol & Evolutionary Biol
Issue Date
2018-09-19
Keywords
unmanned aerial system (UAS)
artificial neural network
mire vegetation
Stordalen
tundra
drone
classification
Publisher
MDPI
Citation
Palace, M., Herrick, C., DelGreco, J., Finnell, D., Garnello, A. J., McCalley, C., ... & Varner, R. K. (2018). Determining subarctic peatland vegetation using an unmanned aerial system (UAS). Remote Sensing, 10(9), 1498.
Journal
REMOTE SENSING
Rights
Copyright © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Collection Information
This item from the UA Faculty Publications collection is made available by the University of Arizona with support from the University of Arizona Libraries. If you have questions, please contact us at repository@u.library.arizona.edu.
Abstract
Rising global temperatures tied to increases in greenhouse gas emissions are impacting high latitude regions, leading to changes in vegetation composition and feedbacks to climate through increased methane (CH4) emissions. In subarctic peatlands, permafrost collapse has led to shifts in vegetation species on landscape scales with high spatial heterogeneity. Our goal was to provide a baseline for vegetation distribution related to permafrost collapse and changes in biogeochemical processes. We collected unmanned aerial system (UAS) imagery at Stordalen Mire, Abisko, Sweden to classify vegetation cover types. A series of digital image processing routines was used to generate texture attributes within the image for the purpose of characterizing vegetative cover types. An artificial neural network (ANN) was developed to classify the image. The ANN used all texture variables and color bands (three spectral bands and six texture metrics) to generate a probability map for each of the eight cover classes. We used the highest probability for a class at each pixel to designate the cover type in the final map. Our overall misclassification rate was 32%, while omission and commission error by class ranged from 0% to 50%. We found that within our area of interest, cover classes most indicative of underlying permafrost (hummock and tall shrub) comprised 43.9% of the landscape. Our effort showed the capability of an ANN applied to UAS high-resolution imagery to develop a classification that focuses on vegetation types associated with permafrost status and therefore potentially changes in greenhouse gas exchange. We also used a method to examine the multiple probabilities representing cover class prediction at the pixel level to examine model confusion. UAS image collection can be an inexpensive and repeatable avenue to determine vegetation change at high latitudes, which can further be used to estimate and scale corresponding changes in CH4 emissions.
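The classification step described in the abstract, assigning each pixel the cover class with the highest predicted probability and then inspecting the competing probabilities for model confusion, can be sketched as follows. This is an illustrative NumPy example with random stand-in probabilities, not the authors' implementation; the eight-class count comes from the abstract, while the network, class names, and confusion measure here are assumptions.

```python
import numpy as np

# Illustrative sketch (not the authors' code): given per-class probability
# maps from a classifier, assign each pixel the class with the highest
# probability, and keep the margin between the top two probabilities as a
# simple per-pixel indicator of model confusion.

rng = np.random.default_rng(0)
n_classes, height, width = 8, 4, 5          # eight cover classes, toy raster

# Stand-in ANN output: one probability per class at each pixel,
# normalized so the class probabilities at each pixel sum to 1.
probs = rng.random((n_classes, height, width))
probs /= probs.sum(axis=0, keepdims=True)

cover_map = probs.argmax(axis=0)            # winning class index per pixel

# Margin between best and second-best class probability: small margins
# flag pixels where the model is easily confused between two classes.
top2 = np.sort(probs, axis=0)[-2:]
margin = top2[1] - top2[0]

print(cover_map.shape)   # (4, 5)
```

In practice the margin map (or the full stack of class probabilities) can be inspected alongside the cover map to locate pixels where two classes compete, which is the kind of pixel-level confusion analysis the abstract describes.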
Note
Open access journal
ISSN
2072-4292
EISSN
2072-4292
Version
Final published version
Sponsors
National Science Foundation (NSF) [1063037]; National Aeronautics and Space Administration (NASA) [NNX14AD31G, NNX17AK10G]; NSF [1241037]; University of New Hampshire's Hamel Center for Summer Undergraduate Research Abroad (SURF Abroad)
DOI
10.3390/rs10091498