Saturday, July 30, 2016

Exercise 9: Create a Script Tool

Goal: This exercise focused on creating a custom script tool that generates a Swiss hillshade from any input DEM.


Methods: In this exercise Python and ArcMap were both used to create a script tool that can hillshade any DEM. The steps included: importing system and Python modules, setting up variables and loops, dividing the DEM, calculating hillshades, adding the DEM back in, setting up an except statement, creating a custom toolbox in ArcMap, and creating a map in ArcMap.
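The core of a Swiss-style hillshade is blending a standard hillshade with an elevation-weighted "aerial perspective" layer so high ground reads brighter. A minimal pure-Python sketch of that blend (the actual tool used arcpy raster algebra on whole rasters; the weight and function name here are illustrative assumptions):

```python
def swiss_blend(hillshade, dem, dem_max, hs_weight=0.6):
    """Blend hillshade cells (0-255) with an elevation-weighted layer.

    The elevation term brightens high ground, mimicking the aerial
    perspective of a Swiss-style hillshade. Weights are illustrative.
    """
    out = []
    for hs, z in zip(hillshade, dem):
        elev_term = 255 * (z / dem_max)  # scale elevation to 0-255
        out.append(round(hs_weight * hs + (1 - hs_weight) * elev_term))
    return out

# tiny three-cell example
print(swiss_blend([100, 150, 200], [200.0, 500.0, 1000.0], 1000.0))
```

In the real script this blend runs once per cell over the whole raster; a script tool would read the DEM path via a tool parameter instead of a hard-coded value.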


Results: The script (Figure 1) and map highlighting the finished tool (Figure 2) can be found below.


 
Figure 1 - Script showing commands for hillshade tool
Figure 2 - Map showing the result of using the hillshade tool.
Sources:
The “Oregon Spatial Data Library” Accessed in 2016 from http://spatialdata.oregonexplorer.info/geoportal/catalog/main/home.page
USGS National Map Accessed in 2016 from http://nationalmap.gov/
Hupy, C. (2016, July). Module 2 - Exercise 3. Retrieved July, 2016, from http://www.uwec.edu/Staff/hupycm/

Thursday, July 28, 2016

Exercise 8: Raster Processing with Booleans and Loops

Goal: The goal for this second-to-last exercise was to use a suite of loops to not only clip but also project a set of USGS topographic sheets, producing a single, seamless raster.


Methods: As with most of the past exercises, arcpy did the bulk of the work, with a little help from ArcMap at the end to view the end product. To create the seamless raster from the four sheets, a number of steps were performed, including: importing system and Python modules, creating variables and references, building a raster list, creating a loop, formatting raster names, projecting, clipping, creating a search cursor, and merging the tiles. With these tools and commands the creation of a seamless, consistently projected raster was made possible.
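The loop's name-formatting step can be sketched in plain Python: each tile's base name gets a prefix and a new extension so the projected outputs never collide with the inputs. This is a sketch only; the prefix and tile names are illustrative assumptions, and in the real script each formatted name would feed an arcpy Project call before the merge.

```python
import os

def format_raster_names(rasters, prefix="proj_"):
    """Build output names for a list of raster tiles, as the
    projection loop did (prefix and extension are illustrative)."""
    outputs = []
    for ras in rasters:
        base, _ext = os.path.splitext(os.path.basename(ras))
        outputs.append(prefix + base + ".tif")
    return outputs

tiles = ["n45w122.img", "n45w123.img", "n46w122.img", "n46w123.img"]
print(format_raster_names(tiles))
```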


Results: The code for this exercise can be found below (Figure 1); the map showing the seamless raster can be found in Figure 2.




Figure 1 - Code for Exercise 8




Figure 2 - Seamless raster from four USGS sheets.


Sources:
USGS National Map Accessed in 2016 from http://nationalmap.gov/





Tuesday, July 26, 2016

Exercise 7: Risk Model for Landslide Susceptibility in Oregon

Goal: The goal for this exercise was to use Python to generate a risk model for landslides in NW Oregon.




Methods: This exercise used data from our previous exercises and went further in order to create a risk model for predicting landslides. Numerous tools were used, including creating a fishnet, buffering, intersecting, reclassifying rasters, zonal statistics, and various raster tools to derive a risk raster.
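The reclassify step boils down to mapping continuous values (slope, precipitation, etc.) onto a small set of risk classes. A pure-Python sketch of that mapping, with illustrative class breaks standing in for the arcpy Reclassify remap table:

```python
def reclassify(value, breaks):
    """Assign a class (1..n+1) given ascending class breaks,
    mirroring a raster reclassify (break values are illustrative)."""
    for cls, upper in enumerate(breaks, start=1):
        if value <= upper:
            return cls
    return len(breaks) + 1

# slope in degrees -> risk class 1 (low) through 4 (high)
slope_breaks = [10, 20, 30]
print([reclassify(s, slope_breaks) for s in [5, 15, 25, 40]])  # -> [1, 2, 3, 4]
```

In the full model, each reclassified factor raster would then be weighted and summed into the final risk raster.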




Results: The code and resulting map for this exercise can be found below: code (Figure 1) and map (Figure 2). The map shows areas in NW Oregon with varying levels of landslide risk. This can be valuable when selecting locations to live, for insurance purposes, disaster preparedness, etc.
Figure 1 - Code

Figure 2 - Risk Model Result from Figure 1
Sources:
The “Oregon Spatial Data Library” Accessed in 2016 from http://spatialdata.oregonexplorer.info/geoportal/catalog/main/home.page
USGS National Map Accessed in 2016 from http://nationalmap.gov/

Monday, July 25, 2016

Exercise 6: Analyzing Raster Data for Landslide Susceptibility in Oregon

Goal: Continue using PyScripter and a number of raster and vector tools to find common landslide characteristics throughout Oregon.


Methods: For this exercise we again used PyScripter to create a script from start to finish that analyzed areas of Oregon that commonly have landslides. Several factors were examined, including precipitation values, slope, and land use, along with tools such as buffer, add field, calculate field, an update cursor, and zonal statistics. With these data and tools a script was created to identify areas of Oregon that are more susceptible to landslides than others. Because of the way this script was written, it is portable and can be used in other scenarios as well.
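The add-field-then-calculate pattern works like this sketch, where a list of dicts stands in for the attribute table and the loop plays the role of an arcpy UpdateCursor (field names and values here are illustrative, not the exercise's actual schema):

```python
# rows stand in for an attribute table; field names are illustrative
rows = [
    {"AREA_M2": 2_500_000, "SLIDE_CT": 3},
    {"AREA_M2": 1_000_000, "SLIDE_CT": 0},
]

# "add" two fields, then calculate them row by row,
# the way an update cursor walks a real table
for row in rows:
    row["AREA_KM2"] = row["AREA_M2"] / 1_000_000
    row["HAS_SLIDE"] = row["SLIDE_CT"] > 0

print(rows[0]["AREA_KM2"], rows[0]["HAS_SLIDE"])  # -> 2.5 True
```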


Results: The completed script can be found below (Figure 1). An additional product was an Excel file that shows a variety of land types in Oregon that have had landslides in the past. The only issues I ran into were a few indentation errors; they were extremely simple to debug, and it made sense to me why the script had not run to completion.





Figure 1 - Completed Script



Sources:
Hupy, C. (2016, July). Module 2 - Exercise 6. Retrieved July, 2016, from http://www.uwec.edu/Staff/hupycm/
The “Oregon Spatial Data Library” Accessed in 2016 from http://spatialdata.oregonexplorer.info/geoportal/catalog/main/home.page
USGS National Map Accessed in 2016 from http://nationalmap.gov/



Tuesday, July 19, 2016

Exercise 5 - Preparing Raster Data for Analysis using Raster Lists and Loops.

Goal: The primary goal of this exercise was to build off the previous exercise, using tools such as Project and Clip, running basic analyses such as hillshade and slope, and using a FOR IN loop.




Methods: For this exercise a new data set was downloaded. The exercise focused on the State of Oregon and was processed strictly with Python. A number of commands were used to accomplish the following: importing system modules, Python modules, and ArcGIS extensions; creating variables and raster lists; running a FOR IN loop; formatting raster names; projecting; clipping; creating hillshades; and merging tiles.
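The FOR IN loop drives the whole chain: every raster in the list passes through the same sequence of tools. A pure-Python sketch of that control flow, with step names standing in for the arcpy tools (the tile names and step labels are illustrative):

```python
def process_chain(rasters):
    """Record the per-raster steps a FOR IN loop would run;
    each step name stands in for one geoprocessing tool."""
    log = []
    for ras in rasters:
        name = ras.rsplit(".", 1)[0]
        for step in ("project", "clip", "hillshade", "slope"):
            log.append(f"{step}:{name}")
    return log

print(process_chain(["tile_a.img", "tile_b.img"]))
```

The payoff is exactly what the results note: adding a fifth tile means adding one file to the list, not rewriting any code.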




Results: The images below (Figure 1) show the code after it was debugged and run successfully. Scripts like this are immensely effective when dealing with many rasters.



(Figure 1) - Final Script






















Hupy, C. (2016, July). Module 2 - Exercise 5. Retrieved July, 2016, from http://www.uwec.edu/Staff/hupycm/   

Monday, July 18, 2016

Exercise 4: Working with Fields and Selections

Goals: The goal of this exercise was to continue building on the previous exercises by adding a field, calculating a field, and applying an SQL statement successfully using Python.


Methods: For this exercise, data from the previous exercise was used again for further processing. With the lines of code below (Figure 1), two fields were added to the attribute tables and the Calculate Field tool was run on each. The two fields hold area in km² and snow compactness, as was done in previous exercises using Model Builder. After the code was typed out, it was checked for errors and common mistakes such as typos and missed capitalizations, and then run, in this case successfully!
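The two field calculations can be sketched in plain Python. The km² conversion is straightforward; for compactness, the standard isoperimetric ratio is used here as an illustrative stand-in, since the exact formula from the exercise isn't shown in the post:

```python
import math

def area_km2(shape_area_m2):
    """Convert a shape area from square meters to square kilometers."""
    return shape_area_m2 / 1_000_000

def compactness(area_m2, perimeter_m):
    """Isoperimetric ratio: 1.0 for a circle, smaller for irregular
    shapes (an illustrative stand-in for the class's formula)."""
    return 4 * math.pi * area_m2 / perimeter_m ** 2

r = 1000.0  # circle of radius 1 km
print(round(area_km2(math.pi * r**2), 2))
print(round(compactness(math.pi * r**2, 2 * math.pi * r), 2))  # -> 1.0
```

In arcpy these expressions would go into Calculate Field after Add Field creates the two columns.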


Results: The code for this exercise can be found in Figure 1.




(Figure 1) Code for this Exercise






Hupy, C. (2016, July). Module 2 - Exercise 4. Retrieved July, 2016, from http://www.uwec.edu/Staff/hupycm/  

Friday, July 15, 2016

Exercise 3 - Introduction to Geoprocessing Using Python

Goal: The goal for this exercise was to export an existing model as a script, improve it by adding variables, and then run the script successfully.


Methods: The first step of this exercise was opening the model created in Exercise 1 and exporting it so it could be edited and reused. The model was exported as a Python script through ArcMap. Using PyScripter, the code was edited and tweaked: variable names were assigned and linked to the geodatabase, and a similar process was followed for the output variables. Once complete, the script was run and the results were examined. The script created can be found below.
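The main edit after export is replacing the hard-coded paths the model writes out with variables, so one change at the top repoints the whole script. A sketch of that pattern (the workspace path and feature class names are illustrative, not the exercise's actual data):

```python
import os

# after export, hard-coded paths were swapped for variables so the
# script can be pointed at any geodatabase by editing one place
workspace = r"C:\data\exercise3.gdb"   # illustrative path
in_fc = "ski_runs"                     # illustrative input name
out_fc = "ski_runs_buffered"           # illustrative output name

in_path = os.path.join(workspace, in_fc)
out_path = os.path.join(workspace, out_fc)
print(in_path, out_path)
```

Each geoprocessing call in the exported script then references `in_path`/`out_path` instead of a literal string.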




Hupy, C. (2016, July). Module 2 - Exercise 3. Retrieved July, 2016, from http://www.uwec.edu/Staff/hupycm/  

Tuesday, July 12, 2016

Exercise 2 - Advanced Model Builder

Goals: Exercise 2 was intended to expand our knowledge of Model Builder and highlight some features we may not have known before. The main focus of this exercise was the model iterator along with inline variable substitution. Again we dealt with ski resorts and the varying levels of difficulty of each run. There were a number of variables, and using our newly learned skills we were to run each tool on the various datasets a number of times.


Methods: Using data provided by Professor Hupy, we processed it with Model Builder alone (Figure 1). The dataset included ski run data from three different resorts. The first item added was the Iterate Feature Classes tool, and from there the workspace had to be connected. Once input, a 10 m buffer was added, giving each run a buffer. The Zonal Statistics tool was then added to extract data from the rasters for each run. After that, the Join Field tool was used to join the output tables, which allowed us to evaluate the data for each run. Lastly, the Select tool was used with an SQL wildcard sequence, which sorted each run into Beginner, Intermediate, Advanced, and Expert.
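The wildcard selection works like this sketch, where `fnmatch` patterns play the role of the Select tool's SQL `LIKE` expressions (the run labels and pattern prefixes are illustrative assumptions about how the data was coded):

```python
from fnmatch import fnmatch

# illustrative stand-ins for SQL expressions like  name LIKE 'Beg%'
patterns = {
    "Beginner": "Beg*",
    "Intermediate": "Int*",
    "Advanced": "Adv*",
    "Expert": "Exp*",
}

def classify(run_label):
    """Sort one run into a difficulty class by wildcard match."""
    for difficulty, pat in patterns.items():
        if fnmatch(run_label, pat):
            return difficulty
    return "Unknown"

print([classify(r) for r in ["Beg_Run_1", "Exp_Chute", "Adv_Bowl"]])
```

In the model, each pattern is one Select output, so every run lands in exactly one of the four difficulty layers.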
Figure 1 - Model Created







Results: The map below (Figure 2) shows the various runs and their corresponding level of difficulty. This exercise took review skills from Exercise 1 and added analytical thinking along with some more advanced techniques to process numerous data sets efficiently using one model. When naming outputs, it is especially important to be consistent and diligent in order to avoid errors.


Figure 2 - Map Created from Processed Data










Hupy, C. (2016, July). Module 1 - Exercise 2. Retrieved July, 2016, from http://www.uwec.edu/Staff/hupycm/   

Monday, July 11, 2016

Exercise 1 - Model Builder and Geoprocessing Review

     Background: The purpose of this exercise was to review some of the basic skills that go into geoprocessing and creating efficient models using ArcMap and Model Builder.


     This exercise involved the analysis of various data sets, each contributing to factors involved in selecting the ideal location for a ski resort. The location of this ski resort will fall in the Rocky Mountain range, and factors that need to be addressed include: annual snowfall, low temperatures conducive to snow, location within National Forest land, and the proximity of nearby airports.


     As the main focus of this class is to enhance our geoprocessing skills through coding and model building, the entire exercise was to be done using Model Builder. For those who aren't familiar with Model Builder, it is a powerful tool for creating models that ultimately help users run processes and execute tools in a streamlined manner. There is a slight learning curve, but once familiar, building models becomes less difficult and efficiency rises.


     Methods: As this was a review exercise, very basic tools and tasks were required to get us all familiar again with Model Builder. The model below (Figure 1) shows the steps needed to achieve the end result, as the data sets provided were extremely detailed and not catered just to the goals of our exercise. As mentioned earlier, there were a number of criteria that needed to be met, so popular tools used in narrowing down the locations were Clip, Select, Intersect, and Buffer. With it being a rather basic exercise, no additional ArcMap extensions were needed.





Figure 1 - Complete Model
     




Results: The map below (Figure 1.2) shows the end result after the model had been checked for errors and run. Some fine-tuning was done, and cartographic items such as a north arrow and scale bar were added to help viewers interpret the map. Overall the Model Builder review went well and, fortunately, I did not encounter any true obstacles.


Figure 1.2 Final Map Showing Potential Locations










Hupy, C. (2016, July). Module 1 - Exercise 1. Retrieved July, 2016, from http://www.uwec.edu/Staff/hupycm/