Introduction
To map the study area (shown in Figure 1) both before and after the prescribed burn, a C-Astral Bramor equipped with a Field of View PPK and a MicaSense Altum multi-spectral sensor was used. To learn more about the C-Astral Bramor, visit Alan Pecor's blog, where he has tutorials on various applications and components of the platform. A larger area within the Doak property line was captured with the Bramor on each flight; however, to speed up processing for this demonstration, only the yellow Classification Study Area subset was used. After collecting the data and returning to the lab, the PPK information and images were processed in EZ Surv and Pix4D to create two multi-band orthomosaics. Since the Altum senses RGB, "RedEdge", NIR, and Thermal IR, multiple band-stack orthomosaics were created; the composite stacks used for these classifications consisted of the Red, Green, Blue, RedEdge, and Near Infrared bands. ArcGIS Pro was used to perform the following data manipulations.
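To make the idea of a band-stack composite concrete, here is a minimal sketch (not Pix4D's actual processing) in which each output pixel of the composite becomes a vector of the co-registered band values. The tiny 2x2 "rasters" and their values are purely illustrative.

```python
# Conceptual sketch of a five-band composite stack: every pixel in the
# output holds one value from each co-registered input band.
def stack_bands(*bands):
    """bands: 2D lists of equal shape -> 2D list of per-pixel tuples."""
    rows, cols = len(bands[0]), len(bands[0][0])
    return [[tuple(b[r][c] for b in bands) for c in range(cols)]
            for r in range(rows)]

# Hypothetical 2x2 rasters standing in for the five Altum bands used here.
red     = [[10, 11], [12, 13]]
green   = [[20, 21], [22, 23]]
blue    = [[30, 31], [32, 33]]
rededge = [[40, 41], [42, 43]]
nir     = [[50, 51], [52, 53]]

composite = stack_bands(red, green, blue, rededge, nir)
print(composite[0][0])  # (10, 20, 30, 40, 50)
```

Each per-pixel tuple is what downstream tools (segmentation, classification) treat as that pixel's spectral signature.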
The spectral and spatial detail parameters were set to 17 and 10, respectively, which decreased the variance in color between segments (Figure 5) - a step in the right direction for training a classifier. The minimum segment size was then increased in increments of 20 pixels until the segments appeared appropriate for reducing noise both when training the classifier and in the subsequent classification; the value settled on was 80 pixels (Figure 6). Once both images were segmented, the "Classification" tool was used. The first steps in classification were to create a "Classification Schema", add classes (shown in Figure 7), and collect "Training Samples" for each class (Figure 8). These training samples provided the classification algorithm with pixel information to determine the appropriate class for the remaining segments in the composite.
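To illustrate what a minimum segment size does conceptually (this is not ArcGIS Pro's internal algorithm), the sketch below counts the pixels belonging to each segment ID in a label map and flags any segment smaller than the threshold as noise that would be merged away. The label map and threshold here are toy values.

```python
# Conceptual sketch: segments with fewer pixels than min_size are the
# "noise" a minimum-segment-size setting is meant to eliminate.
from collections import Counter

def undersized_segments(label_map, min_size=80):
    """Return the segment IDs whose pixel count falls below min_size.

    label_map: 2D list of segment IDs (one per pixel).
    """
    counts = Counter(seg for row in label_map for seg in row)
    return {seg for seg, n in counts.items() if n < min_size}

# Toy 4x4 "raster": segment 1 covers 14 pixels, segment 2 only 2.
labels = [
    [1, 1, 1, 1],
    [1, 1, 2, 2],
    [1, 1, 1, 1],
    [1, 1, 1, 1],
]
print(undersized_segments(labels, min_size=5))  # {2}
```

Raising `min_size` flags (and ultimately removes) more of these speckle segments, which is why increasing it in 20-pixel steps smoothed the segmentation here.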
After collecting samples for each composite, the next step was to train the classifier; in this case, a "Support Vector Machine" was used (Figure 9). The maximum number of samples per class was left at the default of 500; since only 65 samples were collected per class, that cap was never reached.
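ArcGIS Pro's Support Vector Machine implementation is not reproducible in a few lines, but a nearest-centroid classifier - a much simpler stand-in, named here explicitly as such - illustrates the same supervised idea: per-class training samples (e.g., mean band values of sampled segments) teach the algorithm a representation of each class, which it then uses to label every remaining segment. All band values below are hypothetical.

```python
# Simplified stand-in for supervised classification (NOT the SVM used in
# ArcGIS Pro): each class is summarized by the centroid of its training
# samples, and new segments get the class of the nearest centroid.

def train_centroids(samples):
    """samples: {class_name: [feature_vector, ...]} -> {class_name: centroid}"""
    centroids = {}
    for cls, vecs in samples.items():
        n = len(vecs)
        centroids[cls] = [sum(v[i] for v in vecs) / n for i in range(len(vecs[0]))]
    return centroids

def classify(centroids, vec):
    """Assign vec to the class whose centroid is nearest (squared distance)."""
    return min(centroids,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(centroids[c], vec)))

# Hypothetical 5-band segment means (R, G, B, RedEdge, NIR) for two classes.
training = {
    "burned":   [[60, 55, 50, 70, 90], [58, 52, 49, 72, 95]],
    "unburned": [[40, 80, 35, 160, 220], [42, 85, 30, 150, 210]],
}
cents = train_centroids(training)
print(classify(cents, [59, 54, 50, 71, 92]))  # -> burned
```

An SVM instead learns a separating boundary between the classes, which generally handles overlapping spectral signatures better than raw centroid distance.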
The output was created with the same classification as before, only this time an attribute table containing pixel counts for each class was generated. Within each attribute table, a new column named "Area" was created (Figures 11 and 12); it was added with <NULL> values in its rows. To calculate the area for each class, the ground sampling distance (GSD) was obtained from the layer properties under the "Source > Raster Information" tabs (Figure 13). Then, using the "Calculate Field" tool in the attribute table window (Figure 14), the area was calculated by multiplying each class's pixel count by the squared cell size (the GSD shown in green in Figure 13).
Results
As shown in the resulting maps and confusion matrices, the classification turned out well. Figures 15 and 16 do show some noise and misclassification, but mostly in areas outside of the burn plots. In cell (8, 7) of Figures 19 and 20, the user's and producer's accuracies were 80% and 86%, respectively.
Discussion
Considering the lack of training-sample refinement and reclassification done in this project, the initial results turned out quite well. For brevity in demonstrating the methods presented in this project, the accuracy assessment samples were ground-truthed only against the input imagery and my novice forestry knowledge. While this is an efficient way of getting the job done, it is admittedly not the best way to assess classification accuracy. One option for ensuring accurate classification in future work would be to geolocate samples of various species before the flight, so that, once added as a spatial layer, they could be used to collect better samples of known vegetation types. Another option would be to obtain the spectral signatures of various known species and perform a classification in a more robust software package.
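The two calculations above are simple enough to sketch directly; the pixel count, GSD, and confusion-matrix values below are illustrative examples, not this project's actual numbers.

```python
# Per-class area from a classified raster's attribute table:
# area = pixel count x GSD^2 (GSD = cell size, here in metres).
def class_area_m2(pixel_count, gsd_m):
    """Ground area covered by one class, in square metres."""
    return pixel_count * gsd_m ** 2

# User's and producer's accuracy for class i of a square confusion
# matrix (rows = classified, columns = reference):
#   user's    = correct / row total  (how reliable the map label is)
#   producer's = correct / column total (how much ground truth was found)
def users_producers_accuracy(matrix, i):
    correct = matrix[i][i]
    return (correct / sum(matrix[i]),
            correct / sum(row[i] for row in matrix))

# Hypothetical example: 125,000 pixels of one class at a 5 cm GSD.
print(class_area_m2(125_000, 0.05))  # about 312.5 m^2

cm = [[80, 20],
      [10, 90]]
print(users_producers_accuracy(cm, 0))  # user's 0.8, producer's ~0.89
```

In ArcGIS Pro the area step is the same arithmetic expressed as a "Calculate Field" expression against the pixel-count column.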
If nothing else, a thorough reclassification of commonly misclassified segments could be performed in additional steps not taken in this demonstration. Looking at Figures 17 and 18, the pixel counts and calculated areas for each class could also be used in further analysis of this imagery. By clipping areas of the classified raster where, for example, a section of burn area was missed, the analyst could then obtain the pixel counts and area information for each class within that clipped layer. Outside of wildland fire, utilizing classification and pixel counts to quantify crown widths, species counts, and other information is incredibly beneficial to data collection practices.
*Reclassification step: As shown in Figure 9, the training samples manager contains a pixel count option for the output attribute table; however, after generating the initial classified image, the attribute table did not contain this information. It was discovered that running the "Reclassify" tool with the same values for each class was a workaround for this issue. Once the "reclassified" image was generated, the pixel count column appeared and the area calculation could be executed.
Conclusion
When analyzing the success of a prescribed burn, mapping species coverage, or calculating stand dimensions, classification of multi-spectral UAS imagery is an effective method for obtaining the in-depth information necessary for land management. While multi-spectral imagery is certainly not required to perform classification, it does provide a greater selection of analytical opportunities depending on what is needed. Some things to consider for future work in classifying UAS imagery would be to 1) collect a few coordinates for each class prior to leaving the site, 2) compare results with other classification algorithms in various software packages, and 3) overlay NDVI and other indices to observe any correlation between the success of a burn area and the species and health of the vegetation.
While there is much work to be done in terms of developing the most effective methods for UAS in forestry practices, it is encouraging to know that advancements in UAS and GIS technology will expedite solutions to long-endured problems in land management.
Zach Miller: Welcome to my field blog! Here you will find the latest updates on the geospatial projects I'm working on. I also provide in-depth workflows and explanations of how I use different functions in various GIS and UAS software to create deliverables.