Thursday, December 15, 2016

Processing UAV Data

Introduction
This lab is an introduction to using Pix4D to process unmanned aerial vehicle (UAV) data. In the lab, a few of the functions that Pix4D provides are used to help create a better understanding of the power of Pix4D. This lab will also be treated as a manual that can be referenced in the future on how to create projects, process UAV data, and create orthomosaics and digital surface models (DSM). The data for this lab was collected at the Litchfield Mine by Professor Hupy prior to the lab. Before the lab started, a couple of questions were asked by Dr. Hupy. The questions and answers are below.

Look at Step 1 (before starting a project). What is the overlap needed for Pix4D to process imagery?
- The proper amount of overlap depends on the terrain type. For general cases that do not involve forests, snow, lakes, agricultural fields, or other terrain that is difficult to reconstruct, a 75% frontal overlap and a 60% side overlap are recommended.

What if the user is flying over sand/snow, or uniform fields?
- For rough terrain that contains forest or dense vegetation, an 85% frontal overlap and a 70% side overlap are recommended. This also applies to land covered in snow and to uniform agricultural fields. Oceans are impossible to reconstruct; rivers and lakes can be reconstructed but require a landmass in every image. (A quick calculation of how overlap translates into image spacing is sketched after these questions.)

What is Rapid Check?
- Rapid Check is a processing method within Pix4D that focuses on speed over quality. The resolution of the images is reduced, which in turn lowers the accuracy and may lead to incomplete results. This method is recommended for in-field processing to get a quick check of the data set that was collected.

Can Pix4D process multiple flights? What does the pilot need to maintain if so?
- Yes, Pix4D can process multiple flights. The pilot needs to maintain the same flight height across flights in order for the images to be processed properly.

Can Pix4D process oblique images? What type of data do you need if so?
- Oblique images are taken with the camera not pointing at nadir. Nadir is the term used when the camera lens is pointing directly at the ground or object, meaning the camera is perpendicular to the ground. Oblique images are used to reconstruct 3D objects. An orthomosaic cannot be constructed from them because an orthomosaic uses a flat X,Y plane. To use oblique images, an orthoplane must be created, which again is used to create 3D images.

Are GCPs necessary for Pix4D? When are they highly recommended?
- Ground Control Points (GCPs) are optional; however, GCPs improve the global accuracy of the project. GCPs are HIGHLY RECOMMENDED when processing images that lack geolocations. If GCPs are not used, a few issues will occur:
- The final results will not be scaled, oriented, or georeferenced. This means that no measurements can be taken, overlays can't be added, and results can't be compared to previous results.
- It is possible that the results will produce an inverted 3D model in the rayCloud.
- The 3D model may be shifted; this problem can be fixed using Manual Tie Points.

What is the quality report?
- A quality report is a summary of how the data will be processed and to what extent it will be processed. It informs the reader how many of the images were calibrated, the difference between initial and optimized internal camera parameters, the median matches per calibrated image, and the georeferencing used. The quality report also provides a preview of what the data will look like when complete. The report gives all the information needed to determine whether or not to proceed to the actual processing step. That step can take many hours to complete, so it is very important to read the report and verify that the quality of the data is acceptable.
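As a side note on the overlap question above, the recommended percentages translate directly into exposure and flight-line spacing. The short Python sketch below shows the standard photogrammetry arithmetic; the sensor and flight values are illustrative assumptions, not the parameters of this project's flight.

# Illustrative sketch: overlap percentage -> spacing between images/lines.
# Sensor and flight values are made up for the example.

def ground_footprint_m(sensor_width_mm, focal_length_mm, flight_height_m):
    """Ground width covered by a single nadir image, in meters."""
    return sensor_width_mm / focal_length_mm * flight_height_m

def spacing_m(footprint_m, overlap_fraction):
    """Distance between exposures (or flight lines) for a given overlap."""
    return footprint_m * (1.0 - overlap_fraction)

footprint = ground_footprint_m(13.2, 8.8, 100.0)  # hypothetical 1-inch sensor at 100 m
print("Footprint: %.1f m" % footprint)                                   # 150.0 m
print("75%% frontal overlap -> %.1f m between exposures" % spacing_m(footprint, 0.75))    # 37.5 m
print("60%% side overlap -> %.1f m between flight lines" % spacing_m(footprint, 0.60))    # 60.0 m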

Methods
The first step taken in processing UAV data is to start a new project. The steps to do this are listed below.

-Start a new project under the Project tab
-Name Project
-Choose Workspace
-Choose data that needs to be processed
-Verify that the correct images were uploaded with the proper coordinate system
-Choose a coordinate system for the final project to be processed in
-Choose the type of project that is being worked on

Once the parameters of the project are set up, start the initial processing. This will take a bit of time, especially with a large amount of data. When the initial processing is done, a quality report will appear. For more information on quality reports refer to the introduction. Figure 1 shows a screenshot of the summary from the quality report for this project. It gives the name of the project, the date and time that it was completed, the camera model, the average ground sampling distance, the area covered, and the time it took for the initial process to run.

Figure 1

(Figure 1: The summary of the quality report for the Litchfield Mine data.)


Figure 2 shows the quality check, another feature in the quality report. The key things to notice in this section of the report are the green circles on the right side of figure 2. The green circles mean that a check met the requirements for further processing; the red and yellow triangles are warnings that those areas may not meet the requirements and may have a negative effect on the final product if not fixed. Normally the Camera Optimization should be below 5%; however, Dr. Hupy gave the okay to process this data without it meeting that requirement. Because this was a demo and quality of data was not the driving factor, no ground control points were used. It is also important to note that all 68 of the images were calibrated properly, which will greatly improve the accuracy and quality of the data set.
Figure 2
(Figure 2: Image of the quality check summary from the quality report.)

Figure 3 shows the amount of overlap that occurred during the flight. The areas in green have very high overlap, while the yellow, orange, and red areas have progressively worse overlap. The lack of overlap on the edges will distort those parts of the image. This is why it is good practice when collecting data with a UAV to also collect data around the edges, outside of the study area. This preserves the quality of the actual study area.

Figure 3
(Figure 3: The amount of overlapping images across the study area.)

Figure 4, the final image from the quality report, shows the methods used to create the DSM and orthomosaic. In this case the Inverse Distance Weighting (IDW) method was used. It also gives the time it will take for these datasets to be produced: the DSM will take 2 minutes and 24 seconds, while the orthomosaic will take 3 minutes and 34 seconds.

Figure 4
(Figure 4: Summary of the DSM and Orthomosaic)


The next step is to run the Point Cloud and Mesh process and then the DSM, Orthomosaic and Index process. These processes can take very long depending on the amount of data that is being processed. They provide the final products that can be used to measure volumetrics, distances, and much more. Figure 5 shows the orthomosaic of the Litchfield Mine.

Figure 5
(Figure 5: Orthomosaic of the Litchfield Mine)

The detail of this image is so incredible that it is possible to clearly see the trucks that were parked at the site. 

Results
From figure 5 it is easy to measure the volume of objects. Figure 6 is a zoomed-in image of a sand pile that was digitized and measured for volume. To find the volume of the sand pile it is necessary to digitize around the object that needs to be measured; the better the digitizing, the more accurate the results will be. The little green dots represent each digitized point used. Figure 7 shows the results of the volume measurement.

Figure 6
(Figure 6: The digitized sand pile that was measured for volume.)

Figure 7
(Figure 7: the results of the volume measurement.)

The area that was measured was 624.68 square meters with a total volume of 1,253.03 cubic meters and a margin of error of 18.29 cubic meters. Adding GCPs would greatly help to lower the margin of error in these measurements.
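For a sense of what a volume measurement like this does under the hood, the sketch below sums DSM cell heights above a flat base plane inside the digitized outline. This is a simplification: Pix4D fits its base surface from the digitized vertices, so treat the function and its toy inputs as illustrative assumptions rather than the software's actual algorithm.

import numpy as np

# Minimal sketch of a cut-volume calculation over a DSM, assuming a flat
# base plane at the digitized boundary.

def pile_volume(dsm, mask, cell_size, base_elevation):
    """Sum (height above base) * cell area for cells inside the outline.

    dsm: 2D array of elevations (m); mask: boolean array, True inside the
    digitized outline; cell_size: raster resolution (m); base_elevation: m.
    """
    heights = np.clip(dsm - base_elevation, 0.0, None)  # ignore cells below the base
    return float(np.sum(heights[mask]) * cell_size ** 2)

# Toy example: a 3 x 3 m "pile" 2 m high on a 0.5 m-resolution grid.
dsm = np.zeros((10, 10)) + 100.0
dsm[2:8, 2:8] += 2.0
mask = np.zeros_like(dsm, dtype=bool)
mask[2:8, 2:8] = True
print(pile_volume(dsm, mask, cell_size=0.5, base_elevation=100.0))  # 18.0 m^3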

The polyline feature was used to measure the distance from the front of one truck to the front of another truck. Figure 8 shows the two aforementioned trucks and the line that represents the distance that was measured.
Figure 8
(Figure 8: Shows two trucks and the line measuring the distance from the front of one truck to the front of the other.)


The distance of the line was 5.15 meters. The results table is shown in figure 9.
Figure 9
(Figure 9: Measurement results from the polyline function.)


One of the more powerful features of Pix4D is its ability to export datasets so they can be used in other processing software. The DSM was transferred to ArcMap and ArcScene, where it can be manipulated using every function in either of those Esri programs. Two maps were created of the DSM; the first displays the DSM using Bilinear Interpolation with an inverted Red-Green Dark color scheme. ArcScene is nice because it really allows the 3D to be noticeable. Figure 10 shows the ArcScene 3D image of the DSM. The areas in red represent high elevation and the areas in green low elevation.


Figure 10
(Figure 10: ArcScene 3D model of the Litchfield mine.)

Figure 11 represents the same DSM run through the Hillshade tool in ArcMap. The same Bilinear Interpolation technique was used in displaying this image, and the same color ramp was used in Figure 11 as in Figure 10. (The hillshade step is sketched after the figure.)
Figure 11
(Figure 11: A hillshade of the DSM created from Pix4D)
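As referenced above, the hillshade step itself is nearly a one-liner once the DSM is in ArcMap. Below is a hedged sketch for the ArcMap Python window; the workspace path and raster names are placeholders, and the sun position shown is the tool's conventional default rather than a value recorded in this lab.

import arcpy
from arcpy.sa import Hillshade

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\lab\litchfield.gdb"  # hypothetical workspace

# Hillshade(in_raster, azimuth, altitude, model_shadows, z_factor);
# 315/45 is the conventional default sun position.
shade = Hillshade("dsm", 315, 45, "NO_SHADOWS", 1)
shade.save("dsm_hillshade")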

The amount of detail in this image is remarkable. In the center of the image tire tracks are visible. It is also easy to see the elevation change that occurs on the sand piles. This elevation change could be used to calculate flow rates of water after it rains.

The final portion of this project was to create a fly-by video of the DSM. A fly-by video is a computer-generated video that "flies" around the processed imagery along a path that the user creates. The video created for this lab can be found below in video 1.

Video 1
  

Conclusion
Overall, Pix4D is a very user-friendly software package that makes processing UAV data easy. The tools and functions that were used in this lab are straightforward and easy to comprehend. However, this lab hardly scratched the surface of the capabilities that Pix4D has. It would be nice to have more time to work with UAV data and get a better understanding of the full capabilities of this software.


Thursday, December 1, 2016

Surveying of Point Features Using Dual Frequency GPS

Introduction
     This lab used survey-grade GPS to take elevation points of a small knoll on the lower campus of the University of Wisconsin - Eau Claire. Survey-grade GPS is one of the best ways to gather location data in the field because of its incredible accuracy; depending on the signal that the GPS unit receives, it can be accurate to the centimeter. The downside to survey-grade GPS is the expense of the units. The University only has access to one unit, so the entire class had to take turns using the unit, gathering one point at a time. Figure 1 below shows the location of the knoll that was surveyed.

(Figure 1: A map of lower campus of the University of Wisconsin - Eau Claire. The small area in red represents the study area where this lab took place, located in the middle of the campus mall.)

The area in the northern part of the study area is flat, and the land progressively rises in elevation the further south it goes. There are four benches that stretch west to east through the middle of the study area. For the most part it is a very open area with only a few small trees. This vegetation would not have affected the signal to the GPS unit. Below in figure 1.1 is a photo of the study area. It is easy to see in the photo the drastic change in elevation from a lower elevation on the left (north) side of the photo to a higher elevation on the right (south) side.

(Figure 1.1: A photo of the study area taken the day after the data was collected. The photo was taken by the author of this blog and is a real-world representation of the elevation change that can be compared with the interpolation maps found later in this blog.)
Methods
     As mentioned briefly above, the entire class ventured out to the campus mall to gather elevation data points of the study area. Due to technical issues, only 20 points could be gathered the day that the class had access to the survey GPS unit. The random sample method was used to gather the data points: each student picked a spot of their choosing and, after the data point was gathered, passed the unit off to the next student. When the data was collected it was uploaded in text format to a desktop computer. The text was converted to an Excel file using the Get Data From Text function in Excel. The headings were changed to match the respective data and the table was transferred to ArcMap. The Display XY Data function was used to display the data points. The coordinate system used was NAD 1983 UTM Zone 15N. The data was then exported to a geodatabase so that it could be edited and interpolated. The same interpolation methods from the Sandbox Survey lab were used in this lab, those being Spline, Inverse Distance Weighted (IDW), Kriging, Natural Neighbor, and TIN. The interpolations were cut down to fit the study area using the Extract by Mask tool. The data was displayed using the stretched symbology and set to Bilinear Interpolation to help improve the smoothness of the elevation maps. (A rough script version of this workflow is sketched below.)
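As referenced above, here is a minimal arcpy sketch of the same workflow, assuming the Excel table has already been exported with Easting/Northing/Elevation columns; all paths, dataset names, and field names are hypothetical, and the TIN (which uses a different toolset) is omitted.

import arcpy
from arcpy.sa import Idw, Spline, Kriging, NaturalNeighbor, ExtractByMask, KrigingModelOrdinary

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\lab\knoll.gdb"        # hypothetical geodatabase
sr = arcpy.SpatialReference(26915)               # NAD 1983 UTM Zone 15N

# Display the Excel-derived table as XY points, then save to the geodatabase.
arcpy.MakeXYEventLayer_management("gps_table", "Easting", "Northing",
                                  "gps_layer", sr, "Elevation")
arcpy.CopyFeatures_management("gps_layer", "gps_points")

# Interpolate elevation, then clip each surface to the study area outline.
for name, surface in [
    ("spline", Spline("gps_points", "Elevation")),
    ("idw", Idw("gps_points", "Elevation")),
    ("kriging", Kriging("gps_points", "Elevation", KrigingModelOrdinary())),
    ("natneighbor", NaturalNeighbor("gps_points", "Elevation")),
]:
    ExtractByMask(surface, "study_area").save(name + "_clip")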

Results
     The first map that was created was the Spline interpolation. This interpolation uses a mathematical function to estimate the surface by minimizing surface curvature, in the hopes of creating a smoother finished product. Figure 2 below shows a depiction of the spline interpolation of the study area. The areas in red represent higher elevation while the areas in green represent lower elevation. Unfortunately, the map is not a very good representation of the actual landscape: the area in the northern part of the study area should be rather flat, with a rather drastic rise in the middle, followed by a drastic decrease in the southern portion of the study area. This pattern holds true for all of the interpolations. The spline interpolation does an excellent job of creating a smooth transition from one elevation to another. This smoothness is most noticeable in the middle of the study area, where the patch of high elevation dips down to a lower elevation before rising again to an island of higher elevation. The other interpolation methods, which focus on different aspects of interpolating, lack this smoothness.
(Figure 2: The Spline Interpolation of the study area)

     The second interpolation method is the Inverse Distance Weighted (IDW) method. IDW takes a sample of values surrounding a cell and gives a higher weight to the values that are closer to that cell; this is done for every cell in the finished raster. The IDW interpolation gave the best results for the study area because it does not place the highest point in the southern portion of the study area. The highest point is in the center of the map, which best reflects the actual landscape. It even shows a slightly higher elevation in the middle-eastern portion of the map, which would represent the knoll. Figure 3 below shows a map of the IDW interpolation. (A bare-bones version of the weighting is sketched after the figure.)

(Figure 3: The IDW interpolation of the study area)
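To make the weighting idea concrete, here is a bare-bones version of IDW in Python; this is not the ArcGIS implementation, which adds search radii and other options. Each unknown location gets a weighted average of the sample values, with weights of 1/d^p so that closer samples dominate (p = 2 is the common default).

import numpy as np

def idw(xy_samples, values, xy_targets, power=2.0):
    # Distance from every target to every sample point.
    d = np.linalg.norm(xy_targets[:, None, :] - xy_samples[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)          # avoid division by zero at sample points
    w = 1.0 / d ** power              # closer samples get larger weights
    return (w @ values) / w.sum(axis=1)

samples = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
elev = np.array([100.0, 104.0, 102.0])
print(idw(samples, elev, np.array([[2.0, 2.0]])))  # pulled toward the 100 m point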

     The third interpolation is Kriging, which uses the statistical relationships among the z-values of the data points to create a surface model. While the Kriging output looks very nice and really amplifies the hillshade effect, it did a rather poor job of recreating the study area's landscape. The Kriging method was unable to detect the knoll in the center of the study area. It also amplifies the mistakes that were made in the random sampling of the surface area. Figure 4 below shows a representation of the Kriging interpolation.

(Figure 4: The Kriging Interpolation of the Study Area)

     The fourth interpolation is the Natural Neighbor method. The Natural Neighbor method uses a subset of data points surrounding an area and weights them based on the proportion of the surrounding areas they contribute. This method is by far the most difficult method to understand. It is unclear why the Natural Neighbor method did not completely fill the study area like the previous three methods. It could potentially have been due to a lack of sufficient data points to properly interpolate the rest of the study area, or an error could have occurred during the interpolation process. However, when examining the area that was interpolated, it is easy to see that this method was not as successful as the IDW and Spline methods. It does show a fair representation of the western portion of the knoll but was completely unable to detect the eastern portion. It also does a decent job of showing that the southernmost portion of the study area has a slightly higher elevation than the northernmost portion. Figure 5 below shows a representation of the Natural Neighbor interpolation method.

(Figure 5: The Natural Neighbor Interpolation of the Study Area)

     The final interpolation method is the Triangular Irregular Network, more commonly known as TIN. TINs can be created from feature classes that contain elevation data or from a raster that contains elevation data. The TIN ran into the same problem as the Natural Neighbor method, where it was unable to fully interpolate the study area. The same possible explanations mentioned for the Natural Neighbor method apply to the TIN as well. The TIN provided a poor representation of the study area; it was completely unable to detect any raised elevation in the eastern portion of the study area. Figure 6 below shows a representation of the TIN interpolation.

(Figure 6: The TIN Interpolation of the Study Area)

Conclusion
     Overall, none of the interpolations did a great job of representing the study area. This is mostly attributed to the lack of data points collected in the eastern middle portion of the study area. One contributing factor was that, due to licensing issues outside of the class's control, there was a limit to the amount of data that could be collected. This factor could have easily been worked around by choosing a different method other than random sampling. The class was rather unorganized in collecting the data points, and no one noticed that there were no data points collected on the eastern middle edge of the map. This area was a very important part of the landscape because it is where the eastern half of the knoll is located.

Sources
     For more information on the different methods of interpolation, check out the two links below:
- For Spline, IDW, Kriging, and Natural Neighbor click here.
- For Triangular Irregular Network (TIN) click here.


Tuesday, November 29, 2016

ArcCollector 2

Introduction
     This lab is a continuation of the development of ArcCollector skills. The previous lab had a pre-created database set up prior to the beginning of the lab; this week's lab answers a spatial question created by each individual student. This particular blog investigates where the houses are located that put up Holiday lights early and how many different colored lights were used per house.

Study Area
     The study area chosen for this lab was a two-block section to the east of the University of Wisconsin - Eau Claire campus. This area is part of the downtown area of Eau Claire and consists of a mix of older and nicer houses. It was believed that the owners of these houses would be more likely to put up Holiday lights early.

Methods
     The first step in answering the study question was to create a geodatabase with specific domains to help normalize the data collected in the field. The project in this lab had two different domains. The first was created for the quantitative attribute, the number of different colors used in Holiday decoration per house, with a range between 0 and 15. It was very unlikely that a house had 15 different colored lights, but it was better to guess a little high than not high enough. The other domain was for the qualitative attribute, which was the amount of lights used for each house. This was broken down into 4 categories: Lights Everywhere, They Tried, Meh Holidays, and No Lights. Lights Everywhere represented the most lights, and the further down the list, the fewer lights the house had. (A scripted version of this domain setup is sketched below.) With the domains created, a point feature class could be made. The next step was to create a polygon of the study area. This is helpful when the data collector is in the field: they can easily see on their tablet or smartphone whether or not they are within the study area. With the two feature classes created, they could be uploaded to ArcGIS Online, where they can be added to an online map that can be accessed remotely from a smartphone or tablet.
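As referenced above, the domain setup can also be scripted. Below is a hedged arcpy sketch of the two domains described in this paragraph; the geodatabase path and the domain names are placeholder assumptions, not the ones used in the lab.

import arcpy

arcpy.CreateFileGDB_management(r"C:\lab", "holiday_lights.gdb")  # hypothetical
gdb = r"C:\lab\holiday_lights.gdb"

# Range domain for the quantitative attribute (number of light colors, 0-15).
arcpy.CreateDomain_management(gdb, "ColorCount", "Number of light colors",
                              "SHORT", "RANGE")
arcpy.SetValueForRangeDomain_management(gdb, "ColorCount", 0, 15)

# Coded-value domain for the qualitative attribute (amount of lights).
arcpy.CreateDomain_management(gdb, "LightAmount", "Amount of lights",
                              "TEXT", "CODED")
for code in ["Lights Everywhere", "They Tried", "Meh Holidays", "No Lights"]:
    arcpy.AddCodedValueToDomain_management(gdb, "LightAmount", code, code)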
       In the field, a GPS point was taken at every house in the study area, and the number of different light colors and the amount of lights were collected for each house. This data was then uploaded back to ArcGIS Online. From there an embedded map was created of the data. An embedded map is an online map that can be manipulated by the viewer; Google Maps is the most famous embedded map. The embedded map below depicts a hotspot map of the quantitative variable. Another thing that can be done with the data in ArcGIS Online is to download it to a desktop version of ArcMap, where it can be cleaned and manipulated into static maps, or non-interactive maps.

Results
     Unfortunately, the results were a little disheartening. The majority of the houses in the study area did not yet have Holiday lights set up. However, the ones that did have lights set up tended to have a larger variety of colors. Figure 1 below shows the amount of lights per house in the form of an embedded heat map. Areas with warmer colors are areas with larger amounts of Holiday lights. Interestingly, the areas further away from campus tended to have more lights than the areas closer to campus.

Conclusion
     This lab really stresses the importance of properly setting up a project before heading out into the field. It is necessary to think through all aspects of the project beforehand and decide on the best way to address them.


Tuesday, November 15, 2016

ArcCollector and Micro Climates

Introduction
     This lab used ArcCollector to gather data remotely in the field and transfer it into the Esri online web service. This method of data collection is exceptionally powerful because it allows multiple people to gather data in various locations at the same time and upload it to the same online document. This lab looks at the micro climate of the University of Wisconsin - Eau Claire campus, specifically the temperature, dew point, wind speed, and wind direction. For this lab the class was broken up into 9 different groups and assigned different locations around campus where the groups gathered the weather attributes. Figure 1 below shows the campus of the University of Wisconsin - Eau Claire broken up into 5 different zones. The author of this post collected data from zone 3.
 (Figure 1: A map of the 5 different zones that were mapped out for the micro climate lab)
The terrain of zone 3 was a mix of woods, bog, and the campus itself. The eastern half of zone 3 is woods with a small creek, called the Little Niagara, traveling through it. The Little Niagara travels all the way through zone 3, then cuts north through zone 4 until it hits the Chippewa River. Certain areas around the creek have turned into bog or wetlands; these terrains are solely located in the eastern half of zone 3. The western half of zone 3 contains the campus science building, the student union, and the nursing building. The terrain around these buildings is well kept, with trees planted artistically around the buildings. Other notable terrain in the study area includes the area between zones 5 and 3/4, which is a wooded hill with a very drastic elevation change. The southern portion of zone 1 is the flood plain and also contains a slight elevation change from the water to inland. The northern portion of zone 5 is Putnam Park, which sits on a cliff that overlooks the Chippewa River and is very wooded. The Little Niagara discharges into the Chippewa in Putnam Park.

Methods
     In order to measure the different weather attributes around the Eau Claire campus, different tools were used. The main tool was a Kestrel weather meter, which is able to measure the wind speed, temperature, and dew point, to name a few of its functions. A compass was used to find the direction that the wind was blowing in degrees. The final tool was a smartphone with GPS capabilities that had the ArcCollector app downloaded. This app allows the user to access their Esri online profile from their phone and collect GPS data points.
     Every group had a smartphone and was assigned a zone in which to collect the weather measurements. These points were then uploaded to a pre-created basemap of the area that was provided by the professor, Dr. Joe Hupy. After all the points were collected, each person was able to download the online map to a desktop version of ArcMap and use the field data. This lab called for surface maps of every attribute mentioned above. The inverse distance weighted interpolation method was used to create surface maps of the attributes collected. Wind direction was a special case: instead of creating a surface map, the GPS symbols were turned into arrows and rotated to face the direction the wind was blowing. These arrows were overlaid on the wind speed surface map. (A scripted version of the interpolation step is sketched below.)
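As referenced above, producing one IDW surface per attribute is a short loop in arcpy. The field and dataset names below are assumptions, not the ones used in the lab.

import arcpy
from arcpy.sa import Idw

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\lab\microclimate.gdb"  # hypothetical geodatabase

# One IDW surface per collected weather attribute (hypothetical field names).
for field in ["Temperature", "DewPoint", "WindSpeed"]:
    Idw("weather_points", field).save("idw_" + field.lower())

# Wind direction is displayed rather than interpolated: the point symbols are
# switched to arrows in ArcMap and rotated by the direction field (degrees).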

Results
     Below are all of the surface maps that were created. The first image, Figure 2, shows the surface temperature map for the Eau Claire campus. This map is particularly interesting due to the patterns that formed. The pink blobs in the southeastern portion of the map represent hot spots where the temperature was between 63 and 64 degrees Fahrenheit. These three hot spots are where the student union, science building, and nursing building are located. Another hot spot is in the north, which represents the Eau Claire arts center. The green areas to the east of there represent cold spots. These cold spots represent the wetlands in the woods, where sunlight was reduced, lowering the temperature.
(Figure 2: Surface Temperature Map of the Eau Claire campus)
 
     Figure 3 shows the dew point surface map. This map shows an obvious pattern where a diagonal area of higher dew point separates two different areas of lower dew points. This is caused, in part, by the Little Niagara, which follows the path of high dew points. The forested ridge also follows this same path and may be another cause of the higher dew points.

  (Figure 3: Surface Dew Point Map for the Eau Claire campus)
 
     The final map is the wind direction and wind speed map, which was briefly explained above. Figure 4 is an image of this map. The wooded ridge is a huge factor in wind speed and direction. There is a green path, which represents wind speeds between 0 and 2.2 mph, that follows the ridge line. The areas above and below have significantly higher wind speeds than the areas around the ridge. There are lower wind speeds in the northeastern corner as well. This could be due to the surrounding houses and trees blocking the wind.

(Figure 4: Wind speed and wind direction surface map of the Eau Claire campus)

Conclusion
     This project was an excellent success. The data transfer from everyone's phones to the online Esri mapping site went very smoothly. Having multiple people collecting data points simultaneously saved a considerable amount of time and was overall more efficient. ArcCollector is an amazing resource that is significantly cheaper and more accurate than most GPS units. It also makes getting the data from the field to ArcMap quick and easy. It would be unsurprising if smartphones and ArcCollector replace most GPS units in the very near future.


Tuesday, November 8, 2016

Navigation Courses at the Priory

Introduction
     The latest project for field methods is a continuation of the previous lab, titled Navigation Maps, in which two maps of the Priory were created to navigate through the Priory to find certain location markers. This lab used the navigation maps of the Priory to locate 5 markers that were scattered throughout the Priory. Along with the maps, a combination of pace counts and compasses was used to help with the navigation process. The Priory contains 5 different sets of markers, also known as courses, for groups to find; however, due to the overly large class size this semester there was a 6th group. Group 6 was assigned markers from multiple courses.

Methods
     The method used to navigate through the Priory was rather simple. First, the coordinates for the five markers were given to each group. The groups then marked the location of the markers on their maps to be used to locate the course markers. Next, a pace count needed to be established for every group member. A pace count is the number of paces it takes for a person to walk a given distance; for this exercise meters were the units used for distance. To find the pace count, a 50 meter tape measure was laid on the ground and a student would walk next to it while counting their steps. Once the student walked the entirety of the tape they could double their number of steps or turn around and walk back. The pace counts used in group 4 were 110 paces per 100 meters for Payden and 135 paces per 100 meters for Sarah. The third group member had recently broken his leg and was unable to put too much stress on it, so he ended up being the note taker during the exercise. The next tool used to help navigate the Priory was a compass. Each compass was able to locate north and find the direction, in degrees from north, in which the next marker was supposed to be located. The compass also had a 5 cm ruler on the side of it. The ruler was used to measure the distance on the map from one marker to the next. The centimeters would then be converted from map distance to actual distance using the map scale. The last piece of equipment used was a GPS unit. The GPS unit recorded the path of each group through the woods and showed how close to the markers each group got.
     To give an example of how all of this worked, let's say that the starting point was in the parking lot and the first point was 11 degrees west of north. The distance from the starting point to the first marker is 7 cm on the map. The scale on the map says that 1 cm = 31.34 meters, so 7 * 31.34 = 219.38 m. If Payden, who has a pace count of 110 paces per 100 meters, was the pace counter, then he would have to walk (110 paces * 2.1938 = 241.3 paces) about 241 paces at 11 degrees west of north to find the first marker. While Payden walked, Sarah would stand at the starting point with the compass, watching to make sure Payden didn't deviate from the 11 degrees west of north. (This arithmetic is worked in the short sketch below.)
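As referenced above, the conversion from map distance to paces is simple enough to script. The numbers below are the ones from this paragraph.

# Map distance -> ground distance -> paces, using this lab's numbers.

def paces_needed(map_cm, scale_m_per_cm, paces_per_100m):
    ground_m = map_cm * scale_m_per_cm          # 7 * 31.34 = 219.38 m
    return ground_m / 100.0 * paces_per_100m    # 2.1938 * 110 = 241.3 paces

print(round(paces_needed(7, 31.34, 110), 1))    # Payden: 241.3 paces
print(round(paces_needed(7, 31.34, 135), 1))    # Sarah: 296.2 paces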

Discussion
     Overall, the exercise did not go as planned; Group 4 was only able to find 2 of the 5 markers. There are many reasons why this exercise was unsuccessful, the main one being the terrain of the Priory. The Priory is a 120 acre rectangle filled with large ravines, dense brush, thorn bushes well over 7 feet tall, and drastic elevation changes. All of these less-than-ideal navigation obstacles made traveling through the Priory very difficult. It was very easy for the pace counter to get off course walking around thorn bushes or thick trees. It was equally difficult for the compass watcher to keep eyes on the pace counter for more than 20 to 25 meters at a time. Also, the maps that were created used a 50 meter grid pattern to navigate with, and this turned out to be too coarse of a grid. When the locations of the markers were marked on the map, there was a reasonable chance that a point could have been off by 20 meters or more, and 20 meters is a significant distance when the forest is as dense as it was. The large changes in elevation also made it especially difficult for the pace counter to measure the distance that was walked: the distance from one point to another on the map is as the crow flies, but the pace counter has to walk up and down different inclines that add numerous extra paces and throw off the distances. The final difficulty encountered was vandalism: whether by animals or people, certain markers were ripped off of trees. Marker number 2 was ripped off of its tree and lying a few meters away from it. Image 1 below shows this; the marker is the pink tape lying next to the bottom part of the tree. Image 2, also featured below, shows a tree with an intact marker.
(Image 1: Marker ripped off of a tree)

(Image 2: Intact Marker)

     To give an overall idea of how unsuccessful group 4 was at locating the markers, due to the variety of issues and obstacles that they ran into, a map was created that shows the location of all 5 of the markers and the path that group 4 took based on the GPS unit that they carried. Figure 1 below shows this map.
(Figure 1: The Path of Group 4 in relation to the actual location of each marker.)

     The data received by the GPS unit is not 100% accurate due to the large amount of canopy coverage from the trees. However, it is easy to see that Group 4 never got close to points 3, 4, or 5 in the north. This is largely because, when marker two was ripped down, they were unsure of exactly where to start when trying to find the third marker, so they missed all the rest by quite a lot.
Figure 2 below shows a map that contains all 6 groups and the courses that they did.
(Figure 2: All groups and courses, color coded.)

Conclusion
    This was a fun albeit difficult exercise that really demonstrated that if one part goes wrong, the whole navigation can go wrong. It is better to simply retrace your steps back to a previously recognized point and start over from there. Also, make sure that the grid used is at an appropriate scale for the area being navigated. The Priory really needs a grid that is 25 meters or smaller; it would make it significantly easier to navigate.


Monday, October 31, 2016

Navigation Maps

Introduction
     The lab for this week consisted of creating two navigation maps for the Priory. The Priory is a 120 acre patch of land a couple of minutes south of the University of Wisconsin - Eau Claire. The maps created for this lab will be used to navigate these 120 acres of woods. The first map uses a projected coordinate system and the other uses a geographic coordinate system.

Methods
     The professor of the class obtained the data used to create these maps from the United States Geological Survey (USGS). The students of the class used this data to create two different maps. The first map created used the projected coordinate system. The projection used is Universal Transverse Mercator (UTM) Zone 15. UTM is a worldwide projection system that breaks the world up into 60 zones; by doing this, each zone experiences less distortion. This system uses meters for its linear unit. To create the map, an aerial image of the Priory was added. This was added to help identify features in the real world while conducting the navigation exercise. The second layer contains contour lines, which indicate the change in elevation of the land. These contours were originally given in 2 foot intervals but were switched to 2 meter intervals for two reasons. The first is that the linear unit of the projection is meters, and it is appropriate to have the contour units match the projection units to improve the accuracy of the contours. The second reason is that the 2 ft. intervals were too detailed and made reading the map difficult; the 2 meter intervals spread out the contour lines, making the map easier to read. Another feature added to this map was an outline of the navigation area being used. This helps to keep the students within the study area, hopefully. The final and most difficult part of this exercise was creating a grid that overlaid the entire map. The challenge of the grid was finding the right amount of detail to add without cluttering the map too much. For the projection map, a 50 meter interval was used to separate the grid cells. (A scripted version of the contour step is sketched at the end of the methods.)
     The second map that was created used the GCS North American 1983 geographic coordinate system (GCS). This coordinate system is designed to represent the North American and Pacific plates. All of the same steps used in the first map were applied to this map. However, the units of this GCS are decimal degrees, so the grid had to be converted to decimal degrees, with an interval of 0.001 decimal degrees separating the cells. Both maps contain a north arrow, a scale bar, a watermark (which is simply the name of the creator of the map), the source of the data, and the coordinate system used.
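As referenced above, the contour layer is the one piece of this map that is easy to show as a script. Below is a hedged arcpy sketch for regenerating 2 m contours from the USGS elevation raster in the ArcMap Python window; the workspace and raster names are placeholders.

import arcpy
from arcpy.sa import Contour

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\lab\priory.gdb"  # hypothetical geodatabase

# Contour(in_raster, out_polyline_features, contour_interval); 2 m interval,
# assuming the elevation raster is already in meters. If the source were in
# feet, the tool's z_factor parameter (0.3048) would convert it.
Contour("priory_elev", "contours_2m", 2)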

Discussion
      The projection map is posted below as figure 1. The projection map will be the easier and more useful of the two maps: due to the projected system, the features are more accurate and experience less distortion. Both maps use white contour lines because of the nature of the aerial imagery; the color white simply makes the lines easier to see than black.
(Figure 1: An image of the map using the UTM projection for the Priory.)

     The second map is the one that uses the geographic coordinate system and is portrayed in figure 2. When comparing the two maps, it is easy to see that the second is more stretched out and narrower than the projected map. This distortion will make navigating slightly more difficult.

(Figure 2: A map of the Priory with a geographic coordinate system.)



Tuesday, October 25, 2016

Survey Using Distance and Azimuth

Introduction
     The point of this lab is to use survey equipment to find the exact locations of trees in the field. The Distance/Azimuth survey method is an excellent way to survey the locations of objects without using GPS units, and it is a good backup plan for when equipment fails in the field. The study area for this project was in Putnam Park, a park that runs through the University of Wisconsin - Eau Claire lower campus.

Methods
     Equipment
     - Laser Distance Finder
     - Distance Finder
     - Compass
     - GPS Unit
     - Notebook and Pencil

     The compass is a device that is able to find north and measure the azimuth from north. The laser distance finder is the most high-tech device that was used; it can measure the distance that something is from the user, the azimuth of that object, and a variety of other things. The distance finder is a two-part device that contains a transmitter and a receiver. The transmitter sends out a signal that is picked up by the receiver, and the receiver is able to measure the distance that the signal traveled. The notebook and pencil were used to take notes in the field.
     The Distance/Azimuth survey method is very convenient when there is a lack of advanced technology. It is a simple matter of setting up a survey area, then using a simple GPS unit, or a smartphone if there is service, to get the location of the origin point. When the origin point is set, find north. North will be the starting point, or 0 degrees, for measuring the azimuth. The next step is to find the location of the object that needs to be mapped: use the compass or laser distance finder to get the azimuth, or the degrees away from north, of the object. Then find the distance of the object from the origin point using a laser distance finder or a distance finder. With the latitude and longitude of the origin point, the azimuth, and the distance of the object from the origin point, it is possible to create a map of the tree locations without GPS. Ten tree locations were taken from three different origin points. The northernmost and middle origin points used the laser distance finder to find the locations of the 10 trees, while the last location used the compass and distance finder. The attributes that were collected include the origin latitude and longitude, the distance of each tree from the origin point in meters, the azimuth of each tree in degrees, the diameter at breast height (DBH) in centimeters, and the tree type. A portion of the data is displayed in figure 1 below.
(Figure 1: A table containing all of the attribute data needed to map out the locations of trees in Putnam Park.)

      The data was collected in three different groups and compiled into a Google Docs spreadsheet. This data was then downloaded and imported into ArcMap. The Bearing Distance To Line tool was used to create a feature class of lines from the origin points to the locations of the trees. The other tool that was used was the Feature Vertices To Points tool, which created a point at the end of each line representing the location of a tree. Figure 2 below is an image of the data flow model that was used to create the features used in the map. (The geometry these tools compute is sketched after figure 2.)

(Figure 2: A data flow model that shows the progression of the data from the original google docs table through both the Bearing Distance to Line tool and the Feature Vertices to Points tool.)
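As referenced above, the geometry behind Bearing Distance To Line is simple planar trigonometry: the line's endpoint is the origin plus an east/north offset computed from the azimuth and distance. The sketch below shows the idea in projected coordinates such as UTM; the coordinates in the example are made up, and real origins recorded in latitude/longitude would need projecting first.

import math

def tree_position(origin_x, origin_y, distance_m, azimuth_deg):
    """Endpoint of a line `distance_m` long at `azimuth_deg` (clockwise
    from north) starting at the origin, in projected coordinates."""
    az = math.radians(azimuth_deg)
    return (origin_x + distance_m * math.sin(az),   # east offset
            origin_y + distance_m * math.cos(az))   # north offset

x, y = tree_position(618000.0, 4957000.0, 25.0, 130.0)  # hypothetical origin
print(round(x, 1), round(y, 1))  # 618019.2 4956983.9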

Results
     There was one major issue that occurred in this project. The second origin's latitude was incorrect, and the features appeared in the parking lot, 20 meters north of the actual location. This problem was remedied by trial and error: the latitude was lowered slightly to try to get the features into the proper location. Figure 3 is an image of a map that shows the incorrect positioning of the second origin point.
(Figure 3: A map of the incorrect positioning of the second origin point's group of trees.)

This map shows a group of points and lines north of the park, on the edge of the parking lot. The actual location was in the middle of the other two groups of lines and points. There are a few possible reasons the group of trees for the second origin point was not in the correct location. The most likely is the ridge that forms to the south of this study area affecting the GPS device that was used, which is another reason why this method is so important: the only technical device that was used failed one out of three times. Another possibility is that the latitude was marked down incorrectly, but this is less likely because numerous different people all wrote down the same latitudes and longitudes. Figure 4 below shows an image of the same area, but with a more accurate latitude for the second origin point.
(Figure 4: An image of the study area with a corrected location for the second origin point.)
Unfortunately, even with the better location for the second group, it is still not accurate. The longest line going in a northern direction should be to the right of the path going through the woods, not to the left. So while this method can give a general idea of where objects are located relative to other objects, it is subject to the accuracy of the origin point. Other factors that may lead to inaccurate results include human error: an improper read of one of the devices will lead to wrong results.

Conclusion
   Overall, this method is excellent if a representation of where things are in relation to other things is all that is needed. If absolute accuracy is needed, this method is subject to many types of errors and inconsistencies that will throw off the results, but it can be very useful when in a bind.