Sunday, June 17, 2018

Module 5: Geoprocessing with ArcGIS

This week in GIS Programming we focused on using Model Builder within ArcMap/ArcCatalog to create a model and then exporting the model as a stand-alone Python script.

Model Builder is a visual representation of the inputs, outputs, and tools used to complete geoprocessing in an ArcGIS environment. Model Builder is an extremely useful tool for completing several geoprocessing steps in succession rather than one at a time. Inputs can be connected visually to a tool, and that tool's outputs then become the inputs for the next tool. This process is what we worked on this week. We were given various data and asked to create a model that would Clip, Select, and then Erase certain data from a shapefile.

Once the model was completed in the ArcGIS environment, we were required to export the model as a stand-alone Python script that could be shared using the script itself and a toolbox from ArcGIS. Though a model runs successfully in ArcGIS, the exported Python script will not run successfully without setting parameters and some additional settings. Mainly, file paths have to be checked for accuracy, the workspace environment has to be re-specified (this is done in ArcGIS but does not translate to the script), and the script must also be told to "overwrite existing" output. That last part gave me fits, as I'll explain later.
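As a rough sketch, the adjustments an exported script needs look something like the following. The paths, field name, and where clause here are hypothetical placeholders rather than the actual assignment data, and running it requires an ArcGIS install (the print statement uses the Python 2 syntax that PythonWin expects):

```python
import arcpy

# Re-specify the workspace -- the model's ArcGIS environment setting
# does not carry over into the exported script. Path is hypothetical.
arcpy.env.workspace = r"C:\GISProgramming\Module5\Data"

# Tell the script to overwrite existing output so it can be re-run.
arcpy.env.overwriteOutput = True

# Clip soils to the basin, select the unsuitable soils, then erase them.
# Field name and value below are illustrative placeholders.
arcpy.Clip_analysis("soils.shp", "basin.shp", "soils_Clip.shp")
arcpy.Select_analysis("soils_Clip.shp", "soils_Select.shp",
                      "\"FARMLNDCL\" = 'Not prime farmland'")
arcpy.Erase_analysis("soils_Clip.shp", "soils_Select.shp",
                     "soils_Clip_Erase.shp")

# Confirmation that the overwrite actually ran (see the lesson below).
print "Success"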

Pictured above is the result of my model and subsequent Python script. The shapefile of interest is soils_Clip_Erase (grey). It is the product of clipping the provided soils.shp to the area of the basin.shp shown in pink, Selecting "Not prime farmland", and then Erasing the selection. The result is a subset of the soils shapefile that covers the same area as the basin and only contains areas suitable for farming. These files are the result of the Python script, not the model. The original files created by the model were overwritten by design when the Python script was run. The following is the flowchart I used to help guide the process of creating the model and script.

And this leads me to the major lesson I learned from the module. Adding a print "Success" statement to the end of code that is meant to overwrite existing files is EXTREMELY useful. I ran the script countless times not realizing it was succeeding, because the file names weren't changing and I wasn't getting any messages from the interactive PythonWin window. It was working, but I just couldn't tell. Once I added the print "Success" statement and ran the script again, "Success" printed to the interactive window and I knew I was good to go.

Wednesday, June 13, 2018

Applications in GIS: Hurricanes

This week in Applications in GIS we focused on mapping the track and damage of Hurricane Sandy. Hurricane Sandy affected the New Jersey area in October of 2012. Our task this week was to create two maps: one showing the track of Hurricane Sandy with wind speed and barometric pressure information, along with the states that declared for FEMA aid; the other showing pre- and post-storm imagery along with a damage assessment based on inspection of the images.

The first map, pictured below, was created using X and Y data from a table. X and Y coordinates can be used in ArcMap to plot points as long as the appropriate coordinate system is selected and the values are identified properly, with X representing longitude and Y representing latitude. Once the points were plotted, the Points to Line tool was used to create the track pictured on the map. After that, it was a matter of making appropriate cartographic choices to represent all the information required on the map.
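A minimal arcpy sketch of that workflow, assuming hypothetical file and field names (the actual table and fields in the lab differ), might look like:

```python
import arcpy

# WGS 1984, so the X field is longitude and the Y field is latitude.
sr = arcpy.SpatialReference(4326)

# Plot the table's X/Y values as points, then save them to a shapefile.
arcpy.MakeXYEventLayer_management("sandy_track.csv", "X", "Y",
                                  "track_points", sr)
arcpy.CopyFeatures_management("track_points", "track_points.shp")

# Connect the plotted points into the storm-track line.
arcpy.PointsToLine_management("track_points.shp", "sandy_track_line.shp")
```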

For the Damage Assessment map, we were required to create raster mosaics in order to compare pre- and post-storm images. Once the images were compiled, the slider and flicker tools (found on the Effects toolbar) were used to compare the images for damage assessment. Using those tools within an editing session, along with the Create Features window, each parcel along a chosen street within the damage assessment study area was classified. To streamline the process and prevent erroneous inputs, subtypes and domains were designated prior to beginning the damage assessment.
In addition to the creation of the above maps, we also delved deeper into the functionality and utility of file geodatabases, creating two geodatabases and using built-in functionality to designate domains, etc. for the damage assessment. 

Monday, June 11, 2018

Peer-Review #1

Ashlee Malone

GIS 5103

Peer-Review Assignment #1

            In the article, “Integration of Beach Hydrodynamic and Morphodynamic Modelling in GIS Environment,” Silva and Taborda (2013) describe the process of creating a tool within the ArcMap environment to model beach hydrodynamics and morphodynamics. The model, Beach Morphodynamic Model (BeachMM), was developed utilizing Python scripting to automate the integration of results from SWAN and XBeach data and is meant to provide data to aid in decisions that affect coastline management. According to Silva and Taborda (2013), “The developed tool aims to simplify the simulation procedure, automate the data flow between predictive models and GIS and graphically display the results with minimal user interaction.” The presentation and justification of the BeachMM by its creators is thorough and follows a logical order while also providing real-world examples of the model’s applicability.
            Silva and Taborda (2013) begin the presentation of the BeachMM by explaining the parts of the model that have been combined in a GIS environment using Python scripting. A sufficient breakdown of the purpose and function of both the SWAN and XBeach models is presented, followed by an explanation of why Python was chosen for tool creation. To justify the need for the model, the authors explain that, traditionally, software outside of a GIS has been utilized to present the combined data, which comes with additional software and personnel costs (Silva & Taborda, 2013). The model creators argue that the model they have created reduces cost by eliminating the need for additional software and also reduces possible human error by automating processes through Python scripting. Though confident in their model, Silva and Taborda offer a word of caution about blindly trusting the automated process, and this shows an interest in maintaining data integrity, which is critical in GIS.
            The article concludes with maps and outputs produced by the BeachMM for a region in Portugal known as Norte Beach. These results further support Silva and Taborda’s (2013) standpoint that there was a need for a model combining SWAN and XBeach data in a GIS. Also included in the article are screenshots of model use in ArcMap. Providing examples of the BeachMM’s output for a known region, and displaying the ease of use that comes from integration with a program GIS professionals already use, presents a strong case for the BeachMM model and its utility.
            Silva and Taborda (2013) provide sufficient background information on the parts of the model, its integration, and the need for it. The article flows logically from what makes up the BeachMM model, to how and why Python was utilized in its creation, to a supporting example of real-world application, resulting in an explanation of processes that is easy to find value in.
Silva, A. M., Almeida, Nobre, Taborda, R. P., & Matos. (2013). Integration of beach hydrodynamic and morphodynamic modelling in a GIS environment. Journal of Coastal Conservation, 17(2), 201-210.

Saturday, June 9, 2018

Module 4: Debugging and Error Handling

This week in GIS Programming we moved on from Python fundamentals and learned how to debug scripts, along with some techniques to resolve errors when they occur. There are three main types of errors that can occur while writing scripts: syntax errors, exceptions, and logic errors. This week we focused on identifying and fixing syntax errors and exceptions.

Syntax errors prevent scripts from being compiled and run, so it is essential to be able to identify the error and location in order to repair and run scripts. One method of identifying the location of errors is to review error messages provided by the Python Interpreter in use. Often the line the error occurs on will be identified in error messages, so it is useful to turn on the line numbers within the interpreter. Syntax errors are often spelling, punctuation or indentation errors.

Exceptions occur while a script is running, so unlike syntax errors, an exception will cause a script to stop partway through or cause the program to crash. Instances of exceptions can be identified by utilizing the debugger tools when running scripts in PythonWin. The debugger allows for "stepping" through the code line by line to identify where the exception is occurring. Once identified, because exceptions are said to be thrown by the script, they can be caught or trapped by using try-except statements. Try-except statements allow a script to continue running despite the presence of an exception.
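As a generic illustration (not the assignment script itself), a try-except block traps a thrown exception so the rest of the script keeps running; here a division by zero stands in for whatever error a tool call might raise:

```python
# Loop over some values; one of them will throw a ZeroDivisionError.
values = [10, 5, 0, 2]
results = []
for v in values:
    try:
        results.append(100 / v)      # raises ZeroDivisionError when v == 0
    except ZeroDivisionError as e:
        print("Trapped an exception: {}".format(e))
        results.append(None)         # record the failure and keep going

print(results)  # the loop finished despite the exception
```

Without the try-except, the script would stop at the third value; with it, the exception is trapped and every value is processed.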

Pictured below is a screenshot of the results from the three scripts provided this week. The first two scripts contained errors that prevented them from running to completion as shown. We were required to find the errors using the methods described above and get the scripts to run properly. The third script (results start with "Running Part A") required the use of a try-except statement to trap an exception that was occurring while the script ran. This last task presented a few challenges for me because I was not being careful with indentation when I introduced the try-except statement into the script. As mentioned above, indentation errors are a type of syntax error, so my script would not run at all until I resolved the issues.

Tuesday, June 5, 2018

Applications in GIS: Tsunami

This week in Applications in GIS we focused on damage assessments. Specifically, we analyzed data from the 2011 earthquake and tsunami that affected the NE coast of Japan. In March of 2011, Japan was impacted by a 9.0 earthquake and subsequent tsunami. Data in the form of raster DEMs and vector point data (representing power plants, cities, and roads) were analyzed to produce this week's map.

As seen in the map, radiation zones, affected cities, and the populations facing medical consequences based upon distance from the plant were derived from the provided data. Populations up to 50 miles from the Fukushima II power plant were affected when the earthquake and tsunami caused failures at the plant. This output was created using multi-ring buffers originating from the plant and the Select By Location tool to pull the affected cities from a larger city database.
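Sketching that workflow in arcpy, with illustrative ring distances and hypothetical file names rather than the exact lab values:

```python
import arcpy

# Buffer rings around the plant out to 50 miles (distances illustrative).
arcpy.MultipleRingBuffer_analysis("fukushima_plant.shp",
                                  "radiation_zones.shp",
                                  [10, 20, 30, 40, 50], "Miles")

# Pull the cities that fall inside the rings from the larger database.
arcpy.MakeFeatureLayer_management("cities.shp", "cities_lyr")
arcpy.SelectLayerByLocation_management("cities_lyr", "INTERSECT",
                                       "radiation_zones.shp")
arcpy.CopyFeatures_management("cities_lyr", "affected_cities.shp")
```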

Also pictured is a representation of the runup zones created by the tsunami. This dataset was created from raster DEM files along with the city, road, and power plant data, using a tool built in Model Builder. The tool streamlined the process of creating three separate runup zones and intersecting each with the road, city, and power plant data.

Saturday, June 2, 2018

Module 3: Python Fundamentals Part II

This week we continued foundational concepts in Python. Goals for this week included learning about importing modules, correcting script errors, creating loops and conditional statements, and adding comments to scripts.

The assignment for this week required the use of all of the above-mentioned topics.

Above is a screenshot of the results of my final script for this week. The assignment began by requiring that we identify an error in the dice game that appears first in the results. I corrected the errors by reviewing the error messages PythonWin produces when a process cannot run correctly.

Next, we were required to create a list of 20 randomly generated integers between 0 and 10. This was accomplished by importing the random module (so that the randint function would be available) and using a while loop. While loops require a sentry variable; without the sentry, the loop will run continuously without stopping. This happened to me several times while attempting the assignment. Once the sentry condition was met, in this case once 20 integers had been added to the list, the loop stopped and no additional integers were added. Pictured below is a flowchart representing these steps.
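In code, the loop might look something like this sketch (the variable names are my own, not necessarily those from the assignment):

```python
import random

dice = []          # list to hold the random integers
count = 0          # sentry variable controlling the loop
while count < 20:                        # without this condition the loop never stops
    dice.append(random.randint(0, 10))   # randint is inclusive on both ends
    count += 1

print(dice)
print(len(dice))
```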

Finally, we were required to create a conditional statement that would print a response based upon the presence of an integer of our choice. I chose the number 3 and made this the unlucky number, as seen above in my results. With the unlucky number identified, I used an if/else statement to print the appropriate response based on the presence of the number 3 in the list. To end the assignment, we were required to script a process that would remove the unlucky integer (if present) and count how many times it was removed. I used another while loop and the list's remove and count methods to produce the last two lines of my results.
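A simplified sketch of that logic, using a fixed stand-in list rather than the randomly generated one from the assignment:

```python
unlucky = 3
numbers = [3, 7, 1, 3, 9, 3, 5]   # stand-in list; the real one was random

# Conditional response based on the presence of the unlucky number.
if unlucky in numbers:
    print("The unlucky number {} is in the list.".format(unlucky))
else:
    print("The list is free of the unlucky number {}.".format(unlucky))

# Remove every occurrence of the unlucky number, counting removals.
removed = 0
while unlucky in numbers:
    numbers.remove(unlucky)
    removed += 1

print("Removed {} instance(s) of {}.".format(removed, unlucky))
print(numbers)
```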

This week was quite challenging, but I feel that I have a better understanding of the concepts after being required to use several of them in one assignment.

Monday, May 28, 2018

Applications in GIS: Lahars

This week's module focused on identifying lahar hazard areas around Mt. Hood, Oregon. Lahars are flowing mixtures of water, rocks, and debris that can result from volcanic eruptions. Mt. Hood has been dormant since the mid-1800s, and the area surrounding the volcano has become heavily populated. The goal this week was to identify stream features from USGS raster data to determine hazard areas surrounding Mt. Hood. Identifying areas with natural streams can help predict the areas and populations that would be affected the most in the event that Mt. Hood becomes active.

In order to identify lahar hazard areas, the raster data were first analyzed using the Hydrology toolset in ArcMap. The Fill, Flow Direction, and Flow Accumulation tools were used to identify the areas of natural streams surrounding the volcano. The Fill tool helps to correct any small discrepancies in the data and prevents flow from pooling up in any areas. The Flow Direction tool then determines the direction of water flow across the cells of a raster image by assigning each cell the direction of its lowest neighboring cell (lowest elevation). Finally, Flow Accumulation was used to determine how many cells flow into each particular cell; this was the final step in determining where streams were located around Mt. Hood.
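The three-step sequence above can be sketched with arcpy's Spatial Analyst module; the workspace path and raster names here are hypothetical, and a Spatial Analyst license is required:

```python
import arcpy
from arcpy.sa import Fill, FlowDirection, FlowAccumulation

arcpy.CheckOutExtension("Spatial")       # Spatial Analyst license
arcpy.env.workspace = r"C:\Lahars\Data"  # hypothetical path

filled = Fill("mt_hood_dem")                 # fill sinks so flow cannot pool
direction = FlowDirection(filled)            # steepest-descent direction per cell
accumulation = FlowAccumulation(direction)   # count of cells flowing into each cell
accumulation.save("flow_accum")              # high values trace the stream channels
```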

With possible flow paths determined, a 1/2-mile buffer was created around the streams to help determine which areas and populations would be affected the most. US Census data from 2010 were used to help identify the total population that could be located in hazard areas. Block groups that fell within the 1/2-mile buffer of stream features were considered to be in hazard areas. Any schools within the 1/2-mile buffer were also identified and placed on the map. Block groups were identified by using Select By Location (within a distance of 1/2 mile) with the stream feature as the selection parameter.
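That selection step might look roughly like the following arcpy sketch, with hypothetical file names standing in for the lab data:

```python
import arcpy

# Select block groups within a half mile of the derived stream features.
arcpy.MakeFeatureLayer_management("census_blockgroups.shp", "blocks_lyr")
arcpy.SelectLayerByLocation_management("blocks_lyr", "WITHIN_A_DISTANCE",
                                       "streams.shp", "0.5 Miles")
arcpy.CopyFeatures_management("blocks_lyr", "hazard_blockgroups.shp")
```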

The map for this week was challenging from a cartographic standpoint. There was a lot to put on the map, and identifying features against elevation color ramps is not easy. It took a lot of time to choose colors, transparency levels, etc. Overall, I am satisfied with the way the map turned out; I feel that the items on the map are easy to identify and that the map fulfills its intended purpose of showing lahar hazard areas around Mt. Hood.