Tuesday, July 26, 2016

Programming: Module 10

This week's lab required me to create a toolbox and a script tool to share with another user.  Standalone scripts are great tools, but a script tool has even more benefits: it is easy to share, the user doesn't need to know Python, and it provides a dialog box for input.

The screenshot below shows the script tool window created in this lab.  To build it, I added a toolbox to my Module 10 folder.  I then added a script to the toolbox and made sure "Store relative path names" was checked.  I selected an already created script for the Script File.  Next, I added four parameters to the script tool, adjusted each one's data type and properties, and set the input and output file locations.  When I open the tool, the window below appears.

Tool Options
The next step in the lab was to adjust the parameters in the standalone script.  I replaced the hard-coded filenames and file paths with arcpy.GetParameter().  The parameter indexes correspond to the order in which I added the parameters in the script tool's properties.  In order to run the script without an error, I had to wrap the output folder parameter in str().  Then I ran the tool with the clip boundary of Durango.shp and selected the four input features.  To print statements in the dialog box, I had to adjust the standalone script again and change the print statements to arcpy.AddMessage().  Results are below.
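The parameter wiring can be sketched like this. The parameter order, filenames, and folders below are hypothetical (only Durango.shp comes from the lab), and a tiny stub stands in for arcpy so the sketch runs outside ArcMap; in the real tool the calls go straight to arcpy.

```python
# Sketch of wiring a standalone script to script-tool parameters.
# The _ArcpyStub class below is a stand-in for arcpy so this runs
# without ArcGIS; the parameter order and values are hypothetical.

class _ArcpyStub:
    """Stand-in for arcpy: returns canned values from the tool dialog."""
    _params = ["Durango.shp",                # 0: clip boundary
               "hospitals.shp;rivers.shp",   # 1: input features (multivalue)
               r"C:\Mod10\Data",             # 2: input folder
               r"C:\Mod10\Results"]          # 3: output folder
    messages = []

    def GetParameter(self, index):
        # The index must match the order the parameters were added
        # in the script tool's properties.
        return self._params[index]

    def AddMessage(self, text):
        # Replaces print() so output shows in the tool's dialog box.
        self.messages.append(text)

arcpy = _ArcpyStub()

clip_boundary = arcpy.GetParameter(0)
input_features = arcpy.GetParameter(1).split(";")
# Real arcpy.GetParameter() returns a geoprocessing object, not a plain
# string, so path parameters must be wrapped in str() before use.
input_folder = str(arcpy.GetParameter(2))
output_folder = str(arcpy.GetParameter(3))

for fc in input_features:
    arcpy.AddMessage("Clipping " + fc + " to " + clip_boundary)
```

In the real script, each AddMessage line appears in the tool's progress dialog instead of the interactive window.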

Dialog Box
To share the script, I compressed the toolbox and the standalone script.
Flowchart



Wednesday, July 20, 2016

Programming: Module 9

Raster Result
In this week's lab I worked with raster files and wrote a script that identifies areas with a certain slope, aspect, and land cover type: forest land cover with a slope between 5 and 20 degrees and an aspect of 150 to 270 degrees.

I imported arcpy and set the environment, and I also imported the tools from the ArcPy Spatial Analyst module.  The script checks that the Spatial Analyst extension is available and checks it out.  I also made sure overwriting output was enabled.  The script reclassified the forest land cover so it would have a value of 1.  Next, the script calculated the slope of the elevation raster.

Then the Aspect tool is used to find the aspect of the elevation raster.  The script finds the cells with a slope greater than 5 and less than 20 degrees, and the cells with an aspect of 150 to 270 degrees.

Finally, the script combines the slope and aspect requirements with the forest land cover.  If Spatial Analyst is not available, the script prints that the license is not available.
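The cell-by-cell logic of that final combination step can be sketched with plain Python lists standing in for the rasters. The cell values and the land cover code 41 for forest are made up for illustration; the actual lab used ArcPy Spatial Analyst tools on real grids.

```python
# Toy grids standing in for the lab's slope, aspect, and land cover
# rasters (values and the forest code 41 are hypothetical).
slope = [[4.0, 10.0], [15.0, 25.0]]
aspect = [[200.0, 90.0], [180.0, 200.0]]
landcover = [[41, 41], [41, 11]]

rows, cols = len(slope), len(slope[0])
result = []
for r in range(rows):
    result.append([])
    for c in range(cols):
        is_forest = landcover[r][c] == 41       # reclassified forest = 1
        good_slope = 5 < slope[r][c] < 20       # slope between 5 and 20
        good_aspect = 150 < aspect[r][c] < 270  # aspect 150 to 270 degrees
        # A cell passes only if all three requirements are met,
        # mirroring the script's final combination step.
        result[r].append(1 if (is_forest and good_slope and good_aspect) else 0)

print(result)
```

Only the lower-left cell meets every condition here; in the lab the same boolean combination is done with Spatial Analyst map algebra on whole rasters at once.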

Below is a flow chart of the script and also a screenshot of the result of the script.

Lab 8: Applications in GIS

This week's lab focused on damage assessment, using Hurricane Sandy as the case study. The map created at the end of the lab shows the path the hurricane took and its category at each point in time. To create this map, I added the world and US shapefiles to ArcMap. I needed only the states affected by the hurricane, so I used Select By Attributes to select them. I added the hurricane points to the map as XY data, and to create the hurricane's path I used the Points To Line tool.

Next, I needed to adjust the hurricane symbology to look like a hurricane. To do this I edited the symbol's properties and changed the symbol set to ESRI Meteorological. I found the symbol that looked like a hurricane, then used the angle setting to tilt it, added a center dot on top, and changed the color to red. I saved the new symbol and categorized the layer by unique values.

Hurricane Sandy Path

The next step was to add graticules to the map. I opened the data frame's properties and went to the Grids tab. From there, I selected the graticule that uses meridians and parallels. Finally, I just had to add the key map elements.

To perform a damage assessment on the New Jersey shoreline, I added a new feature class and placed a point on each parcel. After placing each point, I updated its attributes, and I did this for every parcel in the layer. To determine how many structures fell within 100, 200, or 300 meters of the coastline, I used Select By Location, and I also created a buffer at each distance to verify the counts. To create the coastline itself, I digitized a new polyline feature class running parallel to the parcel area.





Below is the table of the result:

Structural Damage
Counts of structures within distance categories

Category        0-100 m   100-200 m   200-300 m
No Damage          0          0           1
Affected           0          0           7
Minor Damage       0         14          24
Major Damage       3         19          11
Destroyed          9         10           7
Total             12         43          50

Wednesday, July 13, 2016

Programming: Module 8

The Module 8 lab covered working with geometry objects and multipart features.  The script I wrote creates a text file and writes the coordinates, along with Object IDs, for the vertices in the rivers shapefile.  To do this, I imported arcpy, set the environment, and referenced the rivers shapefile.  I used a SearchCursor to get the OID, the Shape geometry object, and the Name field.

I used the open() function to create the text file and named it rivers_Mfelde.  To get the coordinates, names, and Object IDs into the text file, I created a for loop to iterate over all the rows, then a second for loop to iterate over each point in the row's array, which I accessed with the .getPart() method.  I added 1 to the vertex ID to keep track of the vertices, since the point count starts at 0.

To write the data to the text file, I used the .write() method, writing the OID from the outer loop, the vertex ID, the X coordinate, the Y coordinate, and the name of the river.  To ensure that I captured everything correctly, I also printed the results in the interactive window.  Finally, I closed the text file.
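The nested-loop structure can be sketched with plain Python tuples standing in for SearchCursor rows. The river names and coordinates below are made up; in the lab each row comes from arcpy and the point list comes from the geometry's .getPart() method.

```python
# Each tuple stands in for a SearchCursor row:
# (OID, list of (x, y) vertices as getPart(0) would return, river name).
# The names and coordinates are hypothetical sample data.
rows = [
    (1, [(-87.5, 30.4), (-87.6, 30.5)], "Blackwater River"),
    (2, [(-86.9, 30.7)], "Yellow River"),
]

with open("rivers_Mfelde.txt", "w") as f:
    for oid, points, name in rows:               # outer loop: one row per river
        for index, (x, y) in enumerate(points):  # inner loop: each vertex
            vertex_id = index + 1                # add 1 to track vertices from 1
            line = "{} {} {} {} {}\n".format(oid, vertex_id, x, y, name)
            f.write(line)                        # write to the text file
            print(line.strip())                  # echo to the interactive window
# The with-block closes the text file automatically when it ends.
```

Each output line carries the OID, vertex ID, X, Y, and river name, matching the format described above.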

Below is the screenshot of my text file along with a flow chart that shows the process of creating the script.
Text File Results
Script Flow Chart

Lab 7: Applications in GIS

This week's lab focused on coastal flooding.  With concerns about sea level rise due to global warming and storm surge flooding, coastal flooding analysis is important to decision makers.  The map below shows the effects of flooding on Honolulu, HI.

To perform the analysis, I created the flood zones by using the Less Than tool to find areas with elevations below 1.41 and 2.33 meters.  To find the area of each flood zone, I converted the rasters to polygons and used Calculate Geometry on a new field.

I used the Times tool to multiply the DEM by the flood zones, then the Minus tool to get the flood depth.  Next, I added the census tract shapefile, added a field, and calculated the area in square kilometers.  I also calculated the population density by dividing the population by the area.
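The flood-depth math can be sketched in plain Python, with a short list standing in for DEM cells. The elevation values are made up; in the lab the Less Than, Times, and Minus tools did the same comparison, masking, and subtraction on whole rasters, using the 1.41 m and 2.33 m sea level scenarios.

```python
# Toy elevation values in meters standing in for DEM cells
# (hypothetical values; the lab used a real DEM raster).
dem = [0.5, 1.2, 1.8, 2.5, 3.0]

def flood_depth(dem_cells, sea_level):
    """Water depth over each flooded cell; 0 where the cell stays dry."""
    depths = []
    for elev in dem_cells:
        flooded = elev < sea_level  # Less Than: is the cell in the flood zone?
        # Minus: sea level minus elevation gives depth, masked to the zone.
        depths.append(round(sea_level - elev, 2) if flooded else 0)
    return depths

print(flood_depth(dem, 1.41))  # 1.41 m scenario
print(flood_depth(dem, 2.33))  # 2.33 m scenario
```

Cells above the scenario's sea level come out as 0, which is what multiplying the DEM by the 0/1 flood zone accomplishes in the raster version.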


People over 65 are less likely to be affected by the floods than the other population groups in either scenario.  In the 3-foot scenario, the white population makes up the greatest percentage of those affected by flooding; in the 6-foot scenario, home owners are most likely to be affected.  Home owners have the highest social vulnerability of the three groups, and in the 3-foot scenario they are the second highest percentage affected.

Flood Depth Analysis

Monday, July 11, 2016

Peer Review #2

GIS Modelling of Intertidal Wetland Exposure Characteristics discusses the analysis of solar radiation and tidal inundation impacts on coastal ecosystems. The analysis addressed whether solar radiation and atmospheric exposure can be modeled using LIDAR-derived DEM data combined with wetland mapping. The authors stated that previous methods had limitations due to data quality. Once the data was modeled, it would provide exposure characteristics for Nova Scotia, Canada. Four methods of analysis were used, two of them using Python scripts.


Early in the article, the authors, Crowell, Webster, and O'Driscoll, discuss the limitations of analyzing solar radiation and tidal inundation. Poor data samples and quality make the analysis difficult, and the authors do a fair job of showing how GIS analysis along with Python can simplify and improve the desired analysis, which they state extends the localized findings.


The authors clearly explained that the tidal inundation model uses a predictive approach: a script finds the areas at high risk of flood damage by using cell elevations in the LIDAR DEM raster to determine connectivity between adjacent cells. One limitation is that the authors do not explain how this improves upon previous analyses. It would also have been good to know why the authors did not account for conservation of momentum or flow rate. One strong argument the authors made for using a script was that it allowed for realistic modeling of the tides.


The authors do a good job showing how one script can be used alongside another: the tidal inundation model was used with the solar exposure model. However, it was not clear what the parameters for running the script were. The data used was from 2009, and the authors did not make clear whether using older data would affect the analysis. The authors could have gone into more detail about how the two scripts worked together or how the combined analysis was performed.


The article does a good job explaining how the coastal wetland zone script looped through each tidal model delineation to determine spatial overlap. The authors clearly state how the script looked at the lowest and highest elevations and needed annual atmospheric and solar-exposure characteristics.


One strength of the article is that the authors explained how the script can be applied to other parameters and characteristics, such as other chemicals that can impact the areas, and they supported this claim with an example of other contaminants. Another strength was mentioning some limitations of the analysis: by stating that irregular tides were not captured in the script, the authors let the audience understand why scripts are not perfect, even though they are fairly accurate.


Overall, this article does a decent job showing why Python scripting can benefit the environmental analysis of solar exposure and tidal patterns. The authors made a great point that using models to fill in data gaps allows these kinds of findings to be expanded upon. However, the authors could have provided more details on how the scripts worked, or explained how the analysis would be performed if done manually, and the data parameters used were not completely clear. Still, after reading the article, it is clear that using scripting alongside other analysis methods reduces analysis time and allows for far more complex analysis.

Crowell, N., O’Driscoll, N.J., & Webster, T. (2011). GIS Modelling of Intertidal Wetland Exposure Characteristics. Coastal Education & Research Foundation, 44-51.