Wednesday, June 29, 2016

Lab 6: Applications in GIS

This week's lab focused on crime analysis.  The three methods used to determine crime hotspots in Albuquerque were grid-based thematic mapping, Kernel Density, and Local Moran's I.  After performing all three analyses, the next step was to compare which hotspot analysis is best for predicting future crime.

To create the map below, I first performed the grid-based thematic mapping.  To find the hotspots with this method, I did a spatial join of the grids with the 2007 burglaries.  I selected the grids that contained at least one burglary and exported them to a new shapefile.  From those, I found the top 20% of grids by burglary count, dissolved them into one single polygon, and added a field to calculate the area in square kilometers.
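Below is a minimal arcpy sketch of that workflow.  The workspace, file names, and the Join_Count field produced by the spatial join are assumptions based on the lab description, not the exact data used.

```python
import arcpy

# Hypothetical workspace and file names based on the lab description
arcpy.env.workspace = r"C:\GIS\Lab6"
arcpy.env.overwriteOutput = True

# Spatial join counts the 2007 burglaries that fall in each grid cell
arcpy.SpatialJoin_analysis("grids.shp", "burglaries_2007.shp", "grid_join.shp")

# Keep only the grid cells containing at least one burglary
arcpy.Select_analysis("grid_join.shp", "grid_crime.shp", '"Join_Count" >= 1')

# Find the burglary count that marks the top 20% of cells (~80th percentile)
counts = sorted(row[0] for row in
                arcpy.da.SearchCursor("grid_crime.shp", ["Join_Count"]))
cutoff = counts[int(len(counts) * 0.8)]

# Export the top 20%, dissolve into one polygon, and calculate square kilometers
arcpy.Select_analysis("grid_crime.shp", "grid_top20.shp",
                      '"Join_Count" >= {0}'.format(cutoff))
arcpy.Dissolve_management("grid_top20.shp", "hotspot_grid.shp")
arcpy.AddField_management("hotspot_grid.shp", "AREA_KM2", "DOUBLE")
arcpy.CalculateField_management("hotspot_grid.shp", "AREA_KM2",
                                "!shape.area@SQUAREKILOMETERS!", "PYTHON_9.3")
```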

Next I used Kernel Density to determine the hotspots.  I set the processing environment to only the grid area, then ran the Kernel Density tool with an output cell size of 100 and a search radius of 1320 feet, keeping the area units in square miles.  Next I removed the areas with a density of 0, found the mean density, and used it to determine the classifications.  Once that was complete, I converted the raster to polygons.
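A rough arcpy equivalent is below.  The file names and the mean-based classification thresholds are assumptions; the lab set its class breaks from the mean, so the cutoffs shown here are only illustrative.

```python
import arcpy
from arcpy.sa import KernelDensity, SetNull, Con

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\Lab6"  # hypothetical

# Kernel density of the burglaries: cell size 100, search radius 1320 ft,
# area units in square miles
kd = KernelDensity("burglaries_2007.shp", "NONE", 100, 1320, "SQUARE_MILES")

# Remove the zero-density cells
kd_nonzero = SetNull(kd, kd, "VALUE = 0")

# Classify relative to the mean density (thresholds are assumptions);
# Raster to Polygon needs an integer raster, which Con provides
mean_val = kd_nonzero.mean
kd_class = Con(kd_nonzero >= 3 * mean_val, 3,
               Con(kd_nonzero >= 2 * mean_val, 2, 1))
arcpy.RasterToPolygon_conversion(kd_class, "kd_hotspots.shp")
```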

Finally, I used Local Moran’s I to determine hotspots.  I did a spatial join of the block groups and the 2007 burglaries, then calculated the crime rate as burglaries per housing unit.  Next I ran the Cluster and Outlier Analysis (Anselin Local Moran's I) tool with the default parameters.  I used a query to create a shapefile of just the high-high (HH) cluster polygons, dissolved the polygons, and found the area using Calculate Geometry.
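Scripted, that step might look like the sketch below.  The input name and the rate field BURG_RATE are assumptions, and the last three arguments are meant to mirror the tool's dialog defaults; the output's COType field flags the HH clusters.

```python
import arcpy

arcpy.env.workspace = r"C:\GIS\Lab6"  # hypothetical
arcpy.env.overwriteOutput = True

# Anselin Local Moran's I on the burglary rate field (field name is an
# assumption); the last three arguments mirror the tool's dialog defaults
arcpy.ClustersOutliers_stats("blockgroups_join.shp", "BURG_RATE", "lisa.shp",
                             "INVERSE_DISTANCE", "EUCLIDEAN_DISTANCE", "ROW")

# Keep only the significant high-high (HH) clusters, then dissolve them
arcpy.Select_analysis("lisa.shp", "lisa_hh.shp", "\"COType\" = 'HH'")
arcpy.Dissolve_management("lisa_hh.shp", "lisa_hotspot.shp")
```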

Below is a map layout of all three analyses. This helps the Albuquerque police determine where to patrol more by comparing the 2007 hotspots to the 2008 burglaries.
Hotspot Analysis

Programming: Module 7

This week's lab covered checking for data and working with lists and dictionaries.  Another large component of the lab involved search cursors, update cursors, and insert cursors.  The end result of the lab was a script that took data matching specific criteria from a shapefile and loaded it into a dictionary.

Below are the results of the script.  The results show each step and whether the process completed.  The dictionary covers the cities whose feature type is county seat.  Using the update cursor, I populated the dictionary with keys and values: the keys were the names of the cities and the values were the populations from 2000.



Figuring out how to update the dictionary with the keys and values was the most challenging part.  I was not quite sure how to manipulate the update cursor so it would store the city name and the population.  After seeing an example of how each record was iterated as a row, I was able to finish the script.
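The core of that pattern is short once it clicks.  Here is a minimal sketch; the path, field names, and the county seat query are assumptions based on the lab description.

```python
import arcpy

# Hypothetical path and field names based on the lab description
fc = r"C:\GIS\Module7\cities.shp"
county_seats = {}

# Each row returned by the cursor is one city: row[0] is the name
# and row[1] is the 2000 population
with arcpy.da.UpdateCursor(fc, ["NAME", "POP_2000"],
                           "FEATURE = 'County Seat'") as cursor:
    for row in cursor:
        county_seats[row[0]] = row[1]

print(county_seats)
```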


Wednesday, June 22, 2016

Programming: Module 6

This week's lab focused on using ArcPy functions to perform geoprocessing.  Part of the lab covered writing a script in ArcMap and then in PythonWin.

The screenshot below shows the results of the script I created.  The goal of the script was to create a buffer around the hospitals and dissolve the buffer boundaries into larger polygons.  In order to do this, I needed to import ArcPy and set the environments; once they were set, I was able to start using the geoprocessing tools.

The first step was to add XY coordinates to the hospital shapefile.  I then printed the results of the tool to ensure that it worked.  Next I used the Buffer tool to create a 1000 meter buffer around the hospitals, leaving all the optional parameters blank, and printed the results.  Finally, I used the Dissolve tool to join polygons that overlapped.  Once the Dissolve tool finished, I printed its results as well.
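A sketch of that script is below.  The workspace and shapefile names are assumptions; the tool sequence matches the steps described above.

```python
import arcpy

# Hypothetical workspace; the lab's data paths will differ
arcpy.env.workspace = r"C:\GIS\Module6"
arcpy.env.overwriteOutput = True

# Add XY coordinates to the hospital points and confirm the tool ran
arcpy.AddXY_management("hospitals.shp")
print(arcpy.GetMessages())

# 1000 meter buffer around each hospital, optional parameters left at defaults
arcpy.Buffer_analysis("hospitals.shp", "hospitals_buffer.shp", "1000 Meters")
print(arcpy.GetMessages())

# Dissolve the overlapping buffers into larger polygons
arcpy.Dissolve_management("hospitals_buffer.shp", "hospitals_dissolve.shp")
print(arcpy.GetMessages())
```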

Results of Geoprocessing Script


Below is a flowchart to show how the script runs.

Script Flowchart

Lab 5: Applications in GIS

This week's lab covered spatial accessibility modeling.  To perform the analysis, the Network Analyst extension needed to be enabled.  A few of the Network Analyst tools that were the focus this week were Closest Facility, New Service Area, and spatial joins.

To create the map below, I used the New Service Area option from the Network Analyst toolbar.  I found the service area for each of the 7 college campuses.  The search tolerance was 5000 meters, with breaks of 5, 10, and 15 minutes.  After clicking the Solve button, I was able to see the resulting service areas.
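The lab used the toolbar interactively, but the same workflow can be scripted with arcpy.na.  In this sketch the network dataset path, the "Minutes" impedance attribute, and the campus file name are all assumptions.

```python
import arcpy

arcpy.CheckOutExtension("Network")

# Hypothetical network dataset; "Minutes" as the impedance attribute
# and the layer/file names are assumptions
arcpy.na.MakeServiceAreaLayer(r"C:\GIS\Lab5\streets_nd", "CampusSA",
                              "Minutes", "TRAVEL_FROM", "5 10 15")
arcpy.na.AddLocations("CampusSA", "Facilities", "campuses.shp",
                      search_tolerance="5000 Meters")
arcpy.na.Solve("CampusSA")
```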

Next, I created a new shapefile that excluded the Cypress Creek campus and reran New Service Area for the remaining 6 campuses, using the same settings as for the 7 campuses.  I converted the block group shapefile to centroids using the Feature to Point tool.  By doing this, I was able to see how the service areas overlay the block groups.

The results show how the service area decreases if the Cypress Creek campus closes.
Campus Service Areas

Tuesday, June 14, 2016

Lab 4: Applications in GIS

This week's lab covered visibility analysis.  Visibility analysis can be applied to many different situations, such as observation points or fire towers.  This week's lab focused on the Viewshed and Observer Points tools.  Part of the lab covered visibility analysis using 3D Analyst and the LAS Dataset tools.

The first part of the lab used the Viewshed tool with the summit points and the elevation raster.  I also used the Observer Points tool with the same inputs.  Once I had the outputs, I used Extract Values to Points to determine which summit is seen by the most observation points.
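In arcpy that step might look like the following sketch; the raster and shapefile names are assumptions.  The viewshed raster counts how many observers see each cell, so extracting its value at each summit gives the comparison described above.

```python
import arcpy
from arcpy.sa import Viewshed, ObserverPoints, ExtractValuesToPoints

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\Lab4"  # hypothetical names

# Viewshed counts how many summits can see each cell;
# Observer Points records which specific summits see it
vshed = Viewshed("elevation", "summits.shp")
vshed.save("viewshed_ras")
obs = ObserverPoints("elevation", "summits.shp")
obs.save("observer_ras")

# Pull the viewshed count at each summit location to find the
# summit seen by the most observation points
ExtractValuesToPoints("summits.shp", vshed, "summit_vis.shp")
```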

Polyline Visibility Analysis
The next part of the lab used polylines to determine which areas of Yellowstone National Park are visible from the roads.  The inputs were the roads polyline shapefile and the elevation raster.

Part three of the analysis used the 3D Analyst extension.  This allowed me to see a 3D model of the city of Boston.  Once the street view was selected, I was able to rotate the model to see all angles.  Next, I used the LAS Dataset to Raster tool, which created a new raster of the finish line area.  I added the camera shapefile and used the Viewshed tool to see how much area is visible to the one camera.  I then adjusted the offset so the camera was treated as elevated and could see around the buildings.

It was then important to determine the start and end angles, which allowed for a more realistic visible area.  I added two more cameras and performed the same analysis.  After I had the viewshed, I adjusted the symbology to show which areas are seen by 1, 2, or 3 cameras.
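The offset and angles are controlled through specially named fields on the observer points.  Here is a minimal sketch; the file names and the offset/azimuth values are assumptions, not the lab's actual settings.

```python
import arcpy
from arcpy.sa import Viewshed

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\Lab4"  # hypothetical names and values

# The Viewshed tool reads specially named fields from the observer points:
# OFFSETA raises the camera above the surface, and AZIMUTH1/AZIMUTH2
# bound the horizontal viewing angle
for field, value in [("OFFSETA", 100), ("AZIMUTH1", 270), ("AZIMUTH2", 90)]:
    arcpy.AddField_management("cameras.shp", field, "DOUBLE")
    arcpy.CalculateField_management("cameras.shp", field, str(value),
                                    "PYTHON_9.3")

cam_view = Viewshed("finish_line_ras", "cameras.shp")
cam_view.save("camera_viewshed")
```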

The last part of the lab covered line of sight analysis.  For this analysis, I used the Create Line of Sight tool to create a line that connects two summits.  To see more detail, I opened the Profile Graph; the blue dot shows an obstruction.  I also used the Construct Sight Lines tool to create lines between all the summits, which made it possible to see which summits are visible from each summit.
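Scripted with 3D Analyst, the sight line step could look like this sketch; file names are assumptions.

```python
import arcpy

arcpy.CheckOutExtension("3D")
arcpy.env.workspace = r"C:\GIS\Lab4"  # hypothetical names

# Build sight lines between every pair of summits, then test each
# line against the surface for visibility and obstructions
arcpy.ddd.ConstructSightLines("summits.shp", "summits.shp", "sightlines.shp")
arcpy.ddd.LineOfSight("elevation", "sightlines.shp", "los_results.shp")
```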

Programming: Module 5

This week's lab focused on creating a model and exporting it as a script.  The model was created in ArcMap using shapefiles and geoprocessing tools.

When models are exported as scripts, not all of the data sources are populated, so it is important to adjust the parameters and to connect the data to the correct folders.  To create the model used in the script, I added a new model to the toolbox.  From there, I was able to drag the soil and basin shapefiles into the model.  I wanted to clip the soil layer with the basin layer, so I dragged the Clip tool into the model.  To remove soils that aren't ideal for farming, I added the Select tool to the model, and once the area was selected, I was able to add the Erase tool.

During the whole model creation process, I had to make sure that all the parameters were set.  Another key piece of creating the model was enabling outputs to be overwritten.  The model was then exported as a script; scripts can also be added to the toolbox to be run in ArcMap.
Shapefile Output of Ideal Soils
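For reference, the exported script boils down to three tool calls.  This is a sketch with assumed file names and an assumed selection clause (the actual soil suitability field and value depend on the lab data).

```python
import arcpy

arcpy.env.workspace = r"C:\GIS\Module5"  # hypothetical names
arcpy.env.overwriteOutput = True  # lets reruns overwrite the outputs

# Clip the soils to the basin boundary
arcpy.Clip_analysis("soils.shp", "basin.shp", "soils_clip.shp")

# Select the soils that are not ideal for farming
# (field name and value are assumptions)
arcpy.Select_analysis("soils_clip.shp", "soils_notideal.shp",
                      "FARMLNDCL = 'Not prime farmland'")

# Erase the unsuitable soils, leaving only the ideal farmland soils
arcpy.Erase_analysis("soils_clip.shp", "soils_notideal.shp", "soils_ideal.shp")
```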

Below is a flowchart that represents the process of the script that was created to clip and erase certain soils from the basin layer.



Wednesday, June 8, 2016

Programming: Module 4

This week's lab focused on debugging scripts in PythonWin.  In the first script, one way to find the errors was to hit the Check button to see if there were any syntax errors.  If there were no syntax errors, I ran the script, which reported an error and the line it was on.  After fixing the error, I ran it again.  Each time it resulted in an error, I looked at the error type and the line it fell on.  The error-free script prints the fields in the park shapefile.

Layer Details
The second script contained 8 errors.  To find them, I first used the Check button; the syntax was fine, so I had to run the script to see what errors occurred.  Each time I found an error, I corrected it and ran the script again.  The errors included a bad backslash, a wrong file format, misspelled or missing words, and an incorrect data source.  The result of the script prints all the names of the layers in the data frame.
Layers in the Data Frame

The third script also contained errors.  For this script, I added a try-except statement to print the errors from Part A and let Part B run successfully.  In order to do this, I had to find the errors by running the script.  Once I found one error, I put the try statement before the line that contained it, then ran the script again to find the other error.  I used a general exception to print the errors, and Part B then ran successfully.  The result of the script prints the errors in Part A and prints the spatial reference and map scale.
Try-Except Statement
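The shape of that fix is below.  This is a sketch, not the lab's actual Part A: the map document path is hypothetical, and the misspelled attribute stands in for whatever error Part A contained.

```python
import arcpy

mxd = arcpy.mapping.MapDocument(r"C:\GIS\Module4\map.mxd")  # hypothetical path
df = arcpy.mapping.ListDataFrames(mxd)[0]

# Part A: wrapped in try-except so its errors print instead of halting
try:
    for lyr in arcpy.mapping.ListLayers(mxd, "", df):
        print(lyr.dataSourse)  # misspelled attribute, mimicking Part A's error
except Exception as e:
    print("Error in Part A: " + str(e))

# Part B still runs after Part A's error is caught
print("Spatial Reference: " + df.spatialReference.name)
print("Map Scale: " + str(df.scale))
```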

The flow chart illustrates script 1.
Flow chart of Script 1



Lab 3: Applications in GIS

This week's lab covers watershed analysis for Kauai, Hawaii.  The first step of the analysis was to use the Fill tool to fill any errors in the DEM file.  This is done so the modeled flow of the watershed is more accurate, since it removes the sinks.

The next step was to use the Flow Direction tool, which determines the direction water would flow out of each cell.  The tool assigns each cell one of 8 possible directions by comparing it to its neighbors: flow goes from each cell to its lowest neighbor.

The third step uses the Flow Accumulation tool.  This tool produces a layer that counts, for each cell, the number of upstream cells that flow into it according to the flow direction raster.  I then applied a threshold of 200 cells, which produced an output containing only streams with an accumulation of 200 cells or greater.

The next step was to use the Stream to Feature tool.  This converts the streams to vector features while maintaining the direction of the streams.  Using the Stream Link tool allowed me to clearly identify individual streams.  Next it was important to determine the hierarchy and scale of the streams with the Stream Order tool, which also uses the flow direction raster.

Once the previous steps were completed, I used the Watershed tool.  The output showed where the streams drain to.  I then added a pour point to the map at a location where the stream drains to the ocean, and the Watershed tool delineated the watershed that drains to that pour point.
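The whole pipeline chains together cleanly in arcpy.  This sketch assumes hypothetical raster and shapefile names and the 200-cell threshold described above.

```python
import arcpy
from arcpy.sa import (Fill, FlowDirection, FlowAccumulation, Con, StreamLink,
                      StreamOrder, StreamToFeature, SnapPourPoint, Watershed)

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\Lab3"  # hypothetical names

filled = Fill("dem")                 # remove sinks from the DEM
fdir = FlowDirection(filled)         # eight-direction (D8) flow model
facc = FlowAccumulation(fdir)        # upstream cell count for each cell

# Keep only cells with 200 or more upstream cells as streams
streams = Con(facc >= 200, 1)

links = StreamLink(streams, fdir)    # unique ID for each stream segment
order = StreamOrder(streams, fdir)   # stream hierarchy (Strahler by default)
StreamToFeature(links, fdir, "streams.shp")  # vectors that keep flow direction

# Delineate the watershed draining to a pour point snapped to the
# highest-accumulation cell within 100 map units
pour = SnapPourPoint("pour_point.shp", facc, 100)
wshed = Watershed(fdir, pour)
wshed.save("watershed_ras")
```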

The results of this analysis compare the modeled streams to the streams of the National Hydrography Dataset (NHD).  I also compared the modeled watershed to the NHD watershed.
Watershed Analysis

Wednesday, June 1, 2016

Lab 2: Applications in GIS

Lab 2 covered least cost path and corridor analysis.  To perform the least cost path analysis, I reclassified land cover, elevation, and the Euclidean distance around the roads.  Once the layers were reclassified, I used the Cost Distance tool, which finds the accumulated cost of travel away from each of the national parks.  To find the least cost path itself, I would then use the Cost Path tool.

However, to determine the national park corridor, I did not need the Cost Path tool.  Once the cost distance outputs were created for both national parks, I used the Corridor tool, which requires the two cost distance outputs.  Once the corridor layer was added, I needed to adjust the symbology and determine a threshold for the corridor, making sure it was not too wide.  I adjusted the colors to include 3 levels from least to most suitable: least suitable is the lightest and most suitable is the darkest.
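The relationship between the two workflows is easiest to see in code.  In this sketch the park shapefiles and the cost surface raster are assumptions; a least cost path needs a backlink raster, while the corridor needs only the two cost distance surfaces.

```python
import arcpy
from arcpy.sa import CostDistance, CostPath, Corridor

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\Lab2"  # hypothetical names

# Accumulated travel cost away from each park over the reclassified
# cost surface; the backlink rasters are only needed for Cost Path
cd1 = CostDistance("park1.shp", "cost_surface", "", "backlink1")
cd2 = CostDistance("park2.shp", "cost_surface", "", "backlink2")

# A least cost path between the parks would use a cost distance
# surface plus its backlink raster
lcp = CostPath("park2.shp", cd1, "backlink1")

# The corridor only needs the two cost distance surfaces
corr = Corridor(cd1, cd2)
corr.save("corridor")
```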
Most Suitable Corridor

Programming: Module 3

The Module 3 lab involved using modules, conditions, and loops.  To get the results in the screenshot below, I had to import the math module.  Part of the script creates a list of players and the dice game results.  Next, I created a while loop that randomly selects 20 integers ranging from 0 to 10, and I printed one list of all the integers.  If the number 8 was in the list, I had it removed: I counted how many times the number 8 appeared in the list, then used a while loop inside a condition statement to remove the 8s until they were gone.  The count of remaining 8s determined whether the while loop continued to run.  The results then printed.

Script Results
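The removal step is the trickiest part to get right, so here is a minimal sketch of just that logic; the variable names and print wording are assumptions, not the lab's exact script.

```python
import random

# Build the list of 20 random integers from 0 to 10
numbers = []
while len(numbers) < 20:
    numbers.append(random.randint(0, 10))
print(numbers)

# Count the 8s; if any exist, remove them until they are gone
count8 = numbers.count(8)
if count8 > 0:
    while numbers.count(8) > 0:
        numbers.remove(8)
    print("Removed {0} eights: {1}".format(count8, numbers))
else:
    print("There were no eights in the list.")
```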
I felt this lab was rather difficult.  It was a challenge to figure out the correct syntax and order of each operation.  By working through the lab, I learned quickly how important indentation is to the script.  Below is a flow chart of the script.

Flow Chart of Step 4