Sunday, March 23, 2014

Field Activity #7: Introduction to UAVs

This week the class was introduced to different types of UAVs carrying cameras.  The goal of the week was to familiarize the class with using different types of UAVs to capture images.  Two drones, one kite, and one rocket were used to demonstrate how they work.  Dr. Hupy and his friend Max flew their drones for the class, demonstrating how they fly and capture images.  Dr. Hupy also added a personal touch by rigging a kite and a rocket with devices to capture aerial images.  Below are images of the different types of UAVs that were demonstrated to the class.


Figure 1: Dr. Hupy's drone next to the remote which controlled it.  This drone had a flight time of 15 minutes, meaning it could not travel too far from the person controlling it. 



Figure 2: This is the camera attached to the drone in figure 1; it was set to take a picture every 5 seconds.  The camera was mounted on a device that keeps it always facing down and steady. 

Figure 3: Max is preparing to fly the first drone, a rotor copter.  Both drones are linked to a GPS.  The copter will automatically go back to the place it took off from if anything goes wrong.  This is very important in case the copter travels too far away or there is a malfunction in the device.
Figure 4: This is an image of the drone Max built himself.  It uses six propellers compared to the three on the drone in figure 1.  This drone also has a flight time of 15 minutes and is linked to a GPS.  It seemed steadier than the drone in figure 1, possibly because of the six propellers.

After both of the rotor copters were brought out to fly, they had to be calibrated.  The calibration took about 3 minutes and is key to getting the copter to cooperate with the remote and fly straight and steady.

Figure 5: This is an image of the kite our class used to capture images.  Hanging from the string is a camera held up by a device that keeps it steady in the wind.  The kite is a less expensive way of capturing aerial images; however, wind conditions need to be ideal for it to be used. 

Figure 6: Classmate Blake handles the kite with care as it captures images.  The camera can be set to any interval; this day it was set to take an image every five seconds.

Figure 7: Two cameras were attached to this rocket, which was launched into the air by an electrical circuit.  Only one of the engines fired, which cut the flight time short.  This technique was the least successful of the three because of the short flight time and failure to launch properly.  This rocket was designed by Dr. Hupy, and it will continue to be edited and made more efficient. 

Sunday, March 9, 2014

Field Activity #6: Microclimate Geodatabase Construction for deployment to ArcPad

Part 1

This week in class we learned how to develop geodatabase domains and then create a feature class in a personal geodatabase, to be used next week when collecting data with a GPS. The tasks of the week were to decide what belongs in the feature class, create the feature class, and then import it along with a raster background image into ArcMap.  

The first step of the process was deciding what fields were going to be created inside the feature class.  As a class we followed these questions to come up with our data: What is the purpose of the event?  What are you trying to examine?  What are the ranges you are going to be looking at?  What type of data will you be recording?  Pre-planning what goes into a geodatabase before collecting data in the field is very important, because once out in the field you want everything set up perfectly to collect data.  For example, when collecting a survey of temperature you could run into a technology problem.  If the GPS or temperature gauge is failing, it is important to enter into your notes field that problems occurred, so that when looking back at the collected data it will be easy to remember what went wrong. Staying one step ahead when out in the field is critical to having a clean and efficient way of collecting data. More information on what to add to the geodatabase can be found here.

Next week the class will be collecting temperature data around the University of Wisconsin-Eau Claire's campus mall, so the fields used in the collection all relate to components of temperature.  The fields the class decided on were: temperature, wind speed, wind direction (azimuth in degrees and cardinal direction; North, South, etc.), relative humidity, dew point, snow depth, time, group number, and notes.  All of these fields contribute to temperature, and it is important to gather their information when displaying our results.  Wind speed and direction are critical to air temperature: faster wind speeds lead to colder temperatures, and the direction of the wind determines whether warmer or colder air is moving into Eau Claire. If the wind is blowing toward the south it is coming from the north, out of Canada, leading to colder temperatures in Eau Claire.  Relative humidity and dew point are other important factors.  When the air is very humid it feels thicker and hotter, while with low relative humidity the air feels thinner and cooler.  Time will also have a great effect on temperature: the coldest temperatures of the day usually occur around 6 am and the warmest around 2 or 3 pm.  This is important because groups may collect their data at different times of the day.   Snow depth was a personal choice for the class and does not relate to the current temperature.  The notes field will be used to record any important notes during the collection.  Things like "standing next to a building heater" or "the wind was blocked by buildings" will be important when reading the results of the data. 

 

Part 2

The next step, after predetermining what will go into the personal geodatabase, is to create the geodatabase.  To do this, open ArcCatalog and connect to the folder you wish to use when creating the geodatabase by clicking the connect to folder button seen in figure 1.
 
Figure 1: The connect to folder button is shown in the image above.  It is the folder
with the plus sign in the corner.  This will let you connect to the folder you wish
to create the geodatabase in. 
 
Then, after connecting to the desired folder, right click on it and choose new personal geodatabase.  This will create a geodatabase, and the name of the geodatabase can be edited.
 

Figure 2: In ArcCatalog find the folder in which you wish to create the geodatabase, right click, and
choose new personal geodatabase.  The new geodatabase should appear under the folder, as
the image above shows with the geodatabase mc_borgen_gdb.
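
The same step can also be scripted with ArcPy, the Python site package that ships with ArcGIS. Below is a minimal sketch, not part of the original exercise; the folder path and geodatabase name are placeholders.

```python
# Minimal ArcPy sketch: create a personal geodatabase in a folder.
# The folder path and geodatabase name are placeholders -- use your own.
import arcpy

out_folder = r"C:\geog336"       # hypothetical workspace folder
gdb_name = "mc_borgen.mdb"       # personal geodatabases are .mdb files

arcpy.CreatePersonalGDB_management(out_folder, gdb_name)
print("Created " + out_folder + "\\" + gdb_name)
```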

The next step is to edit the domains to set up the feature class.  This step is very important because you will be setting the range of each domain along with its field type: short integer, float, or text.  Temperature, dew point, and relative humidity were set to float.  Notes and wind direction using north, south, east, and west were set to text, and the rest were set to short integers.  To see the domains, right click on your personal geodatabase, click on properties, and then domains.  Figure 3 should then appear on the screen, and the domains are ready to be edited. 
Figure 3: When editing the domain this image should appear on the screen.
 You can set the domain name, field type, and range.
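
The domain setup can be scripted as well. Here is a minimal ArcPy sketch, assuming the geodatabase from the earlier sketch; the temperature bounds are illustrative, not the class's actual values.

```python
# Sketch: create a FLOAT range domain for temperature and set its bounds.
# The workspace path and the min/max values are assumptions for illustration.
import arcpy

gdb = r"C:\geog336\mc_borgen.mdb"

# Create a range domain of type FLOAT (degrees F)
arcpy.CreateDomain_management(gdb, "temp_range",
                              "Valid temperatures", "FLOAT", "RANGE")
# Set the minimum and maximum values the field will accept
arcpy.SetValueForRangeDomain_management(gdb, "temp_range", -40.0, 110.0)
```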

After editing the domains, the next step is to create the feature class containing the different fields.  To do this, right click on the geodatabase, followed by New and Feature Class.  The steps can be seen in figure 4 below.  Next, set the class to a point feature class and choose a coordinate system, seen in figure 5.  For this exercise UTM Zone 15N was used because Eau Claire falls within that zone.  More about UTM zones can be found here. 

Figure 4: To create a feature class, right click on your geodatabase, followed by New and Feature Class.
The figure above should lead the way.


Figure 5: For our study area the class used NAD 1983 UTM Zone 15N.
The city of Eau Claire falls in Zone 15.  

After choosing the coordinate system, click next until you reach a screen similar to figure 6.  Here you can edit the field name and choose the data type and the domain.  In the field name, enter each separate field: temp, dew point, wind speed, etc.  Then choose either float, short integer, or text for the data type and match it to the domain entered earlier.  After entering each field name and data type and matching them to the correct domains, the feature class can be finished. 

Figure 6: An image similar to this should appear when creating a new feature class.
The field name, data type, and domain will have to be edited to create the feature class.
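
For reference, here is a hedged ArcPy sketch of the same feature class setup. The field names follow the text above; the domain names and paths are placeholders carried over from the earlier sketches.

```python
# Sketch: create a point feature class and add its fields, matching
# each field to a domain. Domain names and paths are placeholders.
import arcpy

gdb = r"C:\geog336\mc_borgen.mdb"
sr = arcpy.SpatialReference("NAD 1983 UTM Zone 15N")

fc = arcpy.CreateFeatureclass_management(gdb, "microclimate", "POINT",
                                         spatial_reference=sr)

# (name, type, domain) -- float for temp/dew point/humidity, text for
# notes and cardinal wind direction, short integers for the rest
fields = [("temp",       "FLOAT", "temp_range"),
          ("dew_point",  "FLOAT", "dew_range"),
          ("rel_humid",  "FLOAT", "humid_range"),
          ("wind_dir",   "TEXT",  "wind_dir_dom"),
          ("wind_speed", "SHORT", "wind_spd_range"),
          ("snow_depth", "SHORT", "snow_range"),
          ("group_num",  "SHORT", ""),
          ("notes",      "TEXT",  "")]
for name, ftype, domain in fields:
    arcpy.AddField_management(fc, name, ftype, field_domain=domain)
```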

A background image should also be imported for use when bringing the collected data into ArcMap next week.  To do this, follow the steps in figure 7 and import the desired raster image.   After the raster image is loaded into the geodatabase it, along with the feature class, is ready to be imported into ArcMap.

Figure 7: Importing a raster image is rather simple: right click on the geodatabase,
then Import, and Raster Datasets.  Then, after finding the desired raster
and setting the correct output location, it will appear in the geodatabase.
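
This import step also has a geoprocessing tool equivalent (Raster To Geodatabase, in the Conversion toolbox). A one-line sketch with a placeholder raster path:

```python
# Sketch: load a background raster into the same geodatabase.
# The raster file name is a placeholder.
import arcpy

arcpy.RasterToGeodatabase_conversion(r"C:\geog336\campus_ortho.tif",
                                     r"C:\geog336\mc_borgen.mdb")
```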

To do this, open ArcMap and click the add data button, which can be seen highlighted in figure 8.  Then connect to the folder containing the geodatabase you created in ArcCatalog by clicking the connect to folder button seen earlier in figure 1.  Navigate to your folder and add the feature class and raster image to ArcMap.

Figure 8: This figure shows the box that appears when adding data.
The connect to folder button is the same as in figure 1; after connecting to the
folder containing the geodatabase, add the data you want on your map.

If these steps are followed correctly, your screen should look similar to figure 9 below, with the raster image displayed and the feature class and raster listed in the data frame.  You have now prepared a geodatabase with the correct fields, ready to collect data and enter it into ArcMap.

Figure 9: The image is the final result of creating the geodatabase, editing the domains,
creating the feature class, and adding the data to ArcMap.  

Sunday, March 2, 2014

Field Activity #5: Development of a Field Navigation Map and Learning distance/bearing Navigation.

Introduction 

This week involved learning how to navigate on foot using only a compass and a map.  In the upcoming weeks our class will be assigned to navigate a point course set up by our professor Joe Hupy and Al Wiberg, an instructor at the UWEC environmental center.  The week involved finding our walking pace, using a compass to find the azimuth to a destination, and preparing a grid map in ArcMap of the UWEC priory's location. 

Methods
 
The first objective of the week was to find our walking pace.  To do this, the class measured out a straight line of 100 meters and walked down the line counting every other step.  This was done twice, counting steps on the way there and on the way back, and taking the average of the two.  Finding our walking pace will be very useful when navigating the point course at the priory.  My walking pace ended up being 61 paces, and knowing that my pace is 61 steps for every 100 meters will help when navigating the distance between points. 
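
The pace-to-distance conversion is simple arithmetic; here is a tiny Python sketch using my 61-pace count from above.

```python
# Convert a counted number of paces to distance, given 61 paces
# (counting every other step) per 100 meters.
PACES_PER_100M = 61

def paces_to_meters(paces):
    return paces / PACES_PER_100M * 100.0

print(paces_to_meters(61))    # 100.0 m
print(paces_to_meters(30.5))  # 50.0 m
```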
 
The next step in learning how to navigate using distance and bearing is knowing how to use a compass.  Al Wiberg taught the class how to find an azimuth reading using a map.  When using a map and a compass, in order to get from one place to another, the compass must be placed on a straight line between the two points.  Then, using a pencil, draw a line between the two dots and orient the compass housing to north; once the orienting arrow points straight north on the map, the azimuth can be read.  Then, by holding the compass directly in front of the chest, the bearing can be followed in a straight line to the point.   However, when navigating, the magnetic declination should be taken into account.  Magnetic declination is the angle between magnetic north, which the needle points to, and true north; the compass reading should be rotated a certain number of degrees, depending on your location, to account for the declination.  

The last objective of the week was to create a grid map of the priory area marked out by Joe and Al.  This will be the class's area of study when navigating with compass and map.  The first step was to pick background images of what to map and to use two different coordinate systems.  The first map I created used the Universal Transverse Mercator (UTM) system.  UTM projects the world by dividing it into zones 6 degrees of longitude wide across the Northern and Southern Hemispheres.  Eau Claire falls into UTM zone 15, which I used in the first map as you can see in figure 3.  The next coordinate system, used for the second map, was the state plane coordinate system for central Wisconsin.  That projection divides each state into sections, and in this case Eau Claire falls in the central zone.  The UTM map used meters when aligning the grid and the state plane map used decimal degrees. 
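
The zone number can be checked with a quick calculation, since zones are numbered in 6-degree strips starting at 180° W. A small sketch; Eau Claire's longitude of roughly -91.5 is my assumption:

```python
# Quick check of the UTM zone rule: zones are 6 degrees wide,
# numbered eastward from 180 degrees west.
def utm_zone(longitude):
    return int((longitude + 180) // 6) + 1

print(utm_zone(-91.5))  # 15, matching the zone used for the map
```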

The next step in creating the grids was to choose images of the priory.  Figure 2 below shows the background image I chose to grid. 

Figure 1: The square outline in black is the area of the priory;
the red lines are contour lines representing elevation.
The aerial image was taken from USGS and is of Eau Claire.
 
Figure 2: I chose to use the contour lines to show the elevation
that our class will encounter when navigating.  Each line represents
5 meters of elevation. 
 
 
After choosing the background images, the next step was to set up a grid system.  This can be done by following view>data frame properties>grid>new grid.  The first grid map, figure 3, was set up with a grid every 50 meters using the UTM coordinate system.  The grid was then edited to format the labels and intervals, which can be done in grid properties.
 
Figure 3: Grid map of the priory area in Eau Claire.
This area will be navigated by our class in the upcoming weeks using
a compass and a map.  The grid can be edited by choosing properties and
extended properties to format the labels and intervals.
 
 
Figure 4: This image shows the main
menu used to create a grid.
 
 
The next map used the state plane coordinate system for central Wisconsin and was mapped in decimal degrees. 
 
Figure 5: Very similar to figure 3 but mapped in decimal degrees
using the state plane system for central Wisconsin.
 
 
 
 
Discussion/Conclusion
 
Taking the first steps in learning how to navigate using a compass and map to find the bearing and distance between two points is going to be very helpful when navigating in the field.  It will also be helpful in the real world if I ever get lost and need to find my way to safety.  I am looking forward to navigating with my group to see how successful our skills will be.  This is an important step for a geographer: instead of just using a GPS to navigate, we rely on our critical thinking skills to find our way, because "technology will fail you". 


Sunday, February 23, 2014

Field Activity #4: Conducting a Distance Azimuth Survey.

Introduction


The fourth assignment in Geography of Field Methods was to conduct a distance azimuth survey using different geographic techniques.  The assignment called for students to find an open area where conducting many survey points would be easy to do.  These survey points were to be taken with a compass and a laser device to gather the distance and azimuth, and every point should have some type of information attached to it.  After the points were gathered, the next step was to enter them into an Excel file and import them into ArcMap.  Then the bearing distance to line tool was used to plot the data in ArcMap to view the results of the points taken.   This activity was designed to familiarize the class with taking distance and azimuth points using the laser device and a compass.  

Study Area

A big part of the assignment, before any points could be taken, was picking a study area to conduct the survey.  The base point from which to take points must be relatively open, with no trees, poles, or any items that would block the view when looking at it from above.  The area must also have a good number of features to gather points from, for example trees, poles, garbage cans, or any item that can be easily seen from a distance away.  Our group, Drew, Andrew, and I, chose our base point on the campus of the University of Wisconsin-Eau Claire, in between Phillips and Schneider Hall, and mapped points toward and away from campus.  However, after running out of features to collect, we decided to move into the campus mall to find more features to collect.  A total of 100 points were taken from three different base points.  

Figure 1: This image is showing 2 of the 3 base points we used when collecting data.  The
3rd point can be seen in images below.  This aerial image of UWEC is very easy to use
because there are few trees.
Methods
Points were collected using two different instruments: a compass, which determines azimuth, and a laser device, which determines distance, in meters, and azimuth.  Both techniques were used when collecting points.  As mentioned earlier, the first step when conducting a distance azimuth survey is to find an area easily visible in an aerial photo but with many features to survey.  After finding the base point, the next step is to complete the survey.  Our team of 3 decided to survey different types of features: trees, poles, garbage cans, bike racks, and tables.  Andrew used the laser device to collect points, and Drew and I switched off collecting points with the compass and writing down the distance and azimuth of each point.  It is important to have two ways of collecting data in case one fails you, usually the technology.  As our course instructor, Dr. Joe Hupy, says, "Technology will always fail you"; for this reason it was important to collect the azimuth of each point with both the laser device and the compass. 

Figure 2: This is an image of Drew collecting data with the laser device from our first base point.
By simply pressing a button, the azimuth and distance are given on the screen of the device. 

Figure 3: It is very important to remain in the same spot when collecting points.  Utilizing the snow, we
made foot imprints to know where to stand each time when surveying points. 

As each point was collected it was recorded in our notebook under four categories: number, azimuth, distance, and type.  The collection was done fairly quickly once the group got into a rhythm and really started to gather points.  We did have some trouble remembering which tree or pole we had collected, but it was corrected by tracing our steps backwards.  After collecting 50 points in between Phillips and Schneider Hall, we decided to go on to the campus mall to collect 25 points at each of two locations.  This was done because the assignment called for 100 points to be collected.  The assignment also had us collect points with both the laser and the compass.  For each point both the laser and compass were used and the two were compared, which can be seen later in this blog. 

Figure 4: The notebook which we used to combine our data.  It was sorted into 4
different groups: Number, Distance, Azimuth, and Type. 
After collecting the data, the next step is to enter the data into an Excel sheet and import it into ArcMap.  The four categories containing 100 points were entered into an Excel sheet, and the coordinates were formatted to six decimal places.  It is important to use six decimal places; otherwise, for some reason, ArcMap will not be able to use the data and the results will not be seen.  Along with entering the data into Excel, the latitude and longitude of the three base points must be found.  Drew did this by using a phone app that records a lat/long point at the click of a button.  Our first base point had an X, Y (latitude, longitude) of 44.79769, -91.499, as you can see in figure 5 below.

Figure 5: This is a portion of the Excel sheet before it was entered into ArcMap.  6 decimal places were
used when importing it into ArcMap. 

Magnetic declination is the angle between magnetic north and true north.  The compass points to magnetic north, leaving room for error when collecting points.  NOAA has an application that will calculate the degree of declination for any location.  For Eau Claire the declination is 1.36 degrees west (negative), so 1.36 degrees was subtracted from every azimuth collected from the laser and compass.
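
The correction itself is one line of arithmetic. A small sketch, using the convention that west declination is negative, which matches the subtraction described above:

```python
# Correct a magnetic azimuth for declination.
# Eau Claire's declination is 1.36 degrees west, i.e. -1.36.
DECLINATION = -1.36  # degrees; west = negative, east = positive

def true_azimuth(magnetic_az, declination=DECLINATION):
    # wrap the result into the 0-360 degree range
    return (magnetic_az + declination) % 360.0

print(true_azimuth(90.0))  # 88.64
print(true_azimuth(0.5))   # 359.14 (wraps past north)
```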

The next step of the assignment was to import the Excel file into ArcMap and display it as a map.  First a geodatabase was created to store all the files that were going to be created, and then a basemap was imported from USGS to show an image of UWEC's campus in 2013.  Next, tools were used to create points and lines for the 100 different points collected.  The bearing distance to line tool, which can be found in Data Management>Features>Bearing Distance To Line, was used to give us a line from the base point to each point surveyed.  This tool was used three different times because we used three different base points; it would error when trying to use one Excel sheet instead of three because of the three different X, Y coordinates.   The tool was also run twice per base point: once for the laser points and once for the compass points.  After the bearing distance to line box was filled out correctly, the tool completed and lines appeared on our map.  

Figure 6: This image is an example of the first two base points after using the bearing distance to line tool.
The yellow and gray lines are compass points and the red and purple lines are the laser points. 
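
For anyone scripting this, the same tool can be called from ArcPy. This is a hedged sketch; the table, field names, and paths are placeholders standing in for our notebook columns.

```python
# Sketch of the Bearing Distance To Line step in ArcPy.
# Run once per base point table, as described above.
import arcpy

arcpy.BearingDistanceToLine_management(
    in_table=r"C:\geog336\survey.gdb\base1_points",   # hypothetical table
    out_featureclass=r"C:\geog336\survey.gdb\base1_lines",
    x_field="X", y_field="Y",
    distance_field="Distance", distance_units="METERS",
    bearing_field="Azimuth", bearing_units="DEGREES",
    line_type="GEODESIC",
    spatial_reference=arcpy.SpatialReference(4326))   # WGS 84 lat/long
```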

Next, to add points onto the ends of the lines, the feature vertices to points tool was used.  This can be found in Data Management Tools>Features>Feature Vertices To Points.  This tool simply adds a point onto the end of each line to make the map easier to understand and compare to the real world features.
Figure 7: This picture shows the map after the feature vertices
to points tool was used for the first two base points.
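
The equivalent ArcPy call, with placeholder paths matching the sketch above:

```python
# Sketch: drop a point on the end of each bearing line.
import arcpy

arcpy.FeatureVerticesToPoints_management(
    r"C:\geog336\survey.gdb\base1_lines",   # lines from the previous step
    r"C:\geog336\survey.gdb\base1_ends",
    "END")  # keep only the endpoint of each line
```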

Once all of these processes were done, only little tweaks were needed to complete the final map.  The 100 lines with points all appeared on the map coming from the three different base points.   The feature layers created were saved to the geodatabase, and a projection of WGS 84 was used since latitude and longitude were being used instead of meters. 

Results

The final results of our map were fairly accurate.  I was not pleased with the difference between the compass and laser collections; the compass was far too far off compared to the laser and real world points.  This could be because we were not precise enough using the compass, the compass was damaged and not working correctly, or something else.  We did notice that the compass was not working correctly when we were standing next to the pole at our first base point, as you can see in figure 2.  However, when we changed base points we were no longer next to the pole and the compass was still significantly off from the laser azimuth points. 

Figure 8: This is a small to medium scale view of the map.  All the points appear in lime green.  Some
error occurred, as you can see points appearing in the street or on top of buildings which were not
surveyed by us.  However, our base points were very accurate because of the app used by my partner Drew.  

Figure 9: Red dots = compass points.  Green dots = laser points.  When comparing the two different
colors there is inconsistency.  In some cases they are very close to each other; in others they differ greatly.
Also, sometimes the laser point is way off and other times the compass point is off.  This makes it very
difficult to understand which device failed us.  

Conclusion

Overall the surveying tended to be somewhat accurate; combining both the compass and laser points would give a very accurate map of the features surveyed in this area.  Our distance measurements were very accurate, along with the base points being nearly perfect.  This means our errors came mainly from recording the azimuth.  The results we got were fairly good but could have been better, and there is room for improvement if this assignment were done again.  Without knowing whether the laser was actually detecting the feature we wanted or bouncing off something else, it can be less accurate than the compass, which I would not have predicted.  However, the compass failed at some points too; one reason could be magnetic disturbances, but those should have affected the laser too unless it takes them into account.  

Collecting data points can be done in many different ways; using azimuth and distance can give quick and accurate results.  This can be done in almost any weather, either the old-fashioned way, with the compass, or by using new technology, the laser.  This activity has taught me that using new technology may not always be the best.  Looking at the results, some of the laser data points are not accurate compared to the compass, showing that in some cases it is wise to use both new and old technology.  

Sunday, February 16, 2014

Field Activity #3: Unmanned Aerial System Mission Planning


Introduction
The goal of this exercise is to improve critical thinking when planning for different scenarios encountered by geographers. Five different scenarios were given, with the goal of devising a plan for how to solve each one.  While planning for the scenarios, the use of a UAS (Unmanned Aerial System) was highly recommended to be a big factor in the solving process because each scenario involved taking an image of an area.  For each scenario a plan was thought through to include costs, type of UAS, type of sensor, GIS software, time of year, and any other factors needed to complete the process.  However, because of the inexperience of the class, only the legwork of the scenarios was thought through to give an overview of how to solve each mission. 
Scenarios

 Scenario 1
A military testing range is having problems conducting its training exercises due to the presence of desert tortoises. They currently spend millions of dollars doing ground based surveys to find their burrows. They want to know if you, as the geographer, can find a better solution with UAS.

Using UAS to survey for desert tortoise burrows is a much quicker and more cost effective way to discover where the burrows are compared to ground based surveys. There are two main options that can provide high quality data for this kind of survey: LiDAR, and supervised classification using aerial imagery.

LiDAR can be used for this project because it collects elevation data in the form of a point cloud. The LiDAR sensor shoots a laser at the ground, and as the beam is reflected back it records the elevation it was reflected at. The LiDAR sensor requires a large UAS because of its weight, so most rotary propeller UASs are out of the question, but some fixed wing options will work, such as the one in figure 1 below.


Figure 1: A fixed wing UAV, capable of being equipped with a LiDAR sensor

Once the LiDAR data has been processed, a DEM (digital elevation model) will be created. After determining how deep the tortoise burrows are, a base height should be set that many feet or inches above the lowest elevation in the data. This will create a DEM in which the negative elevations represent the tortoise burrows.
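
As a rough illustration of that thresholding idea (not from the scenario text), here is a numpy sketch on a stand-in DEM array; the depth threshold is an assumption.

```python
# Illustrative sketch: flag DEM cells sitting below the surrounding
# ground surface by more than a burrow's depth.
import numpy as np

dem = np.random.rand(100, 100)   # stand-in for a real LiDAR-derived DEM
ground = np.median(dem)          # crude estimate of the base ground height
burrow_depth = 0.15              # assumed depth threshold (same units as DEM)

burrow_mask = dem < (ground - burrow_depth)
print("candidate burrow cells:", int(burrow_mask.sum()))
```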

This option is costly, but if millions of dollars are being spent on ground based surveys, it would be well worth it to use a UAS in this fashion. A second option, which will most likely be much less expensive, would be to fly a UAS and have it take images of the ground, then use a supervised classification on these images to automatically pick out where any tortoise burrows may be.

A supervised classification works by having the user select representative areas using reference sources such as high resolution imagery. The software then characterizes the statistical patterns of the representative areas and classifies the image. The use of a multi-band camera makes the classification scheme much more accurate. This is because the camera records data from a scene as individual color values. From these values a spectral signature can be derived. Using this signature, software such as ERDAS Imagine, will select pixels on the image which are within a specified range of the signature creating an image with one color representing a specific feature such as blue for all water.

This will reduce time in discovering tortoise burrows because the burrows have a unique spectral signature. Since the upturned soil will stand out from the ground it will be easy to select the burrow on an image and specify that all pixels with similar spectral signatures should be classified the same.
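
To make the classification idea concrete, here is a toy minimum-distance classifier over synthetic band values; real software such as ERDAS Imagine uses far more sophisticated statistics, so treat this only as a sketch of the principle.

```python
# Toy supervised classification: assign each pixel to the training
# signature it is closest to in band space. Image and signatures
# are synthetic stand-ins.
import numpy as np

image = np.random.rand(50, 50, 4)             # rows x cols x 4 bands
signatures = {"burrow": np.array([0.6, 0.5, 0.4, 0.3]),
              "ground": np.array([0.3, 0.3, 0.3, 0.5])}

classes = list(signatures)
# distance of every pixel to every class signature
dists = np.stack([np.linalg.norm(image - sig, axis=2)
                  for sig in signatures.values()], axis=0)
labels = np.argmin(dists, axis=0)             # index into classes
print("pixels classified as", classes[0], ":", int((labels == 0).sum()))
```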

This process does involve some ground truthing to verify that the classified burrows are actually burrows and not randomly selected pixels on an image that happen to be similar. Having the person who classified the images do the ground truthing will be best because they will know exactly where the burrows are.  

A camera that captures imagery in multiple bands that would be excellent for this kind of task is the UltraCam shown in figure 2 below. This camera will produce high quality images with the capability to be used in a supervised classification.

Figure 2: UltraCam camera capable of taking images in panchromatic, red,
green, blue, and infrared channels


Scenario 2
A power line company spends lots of money on a helicopter company monitoring and fixing problems on their line. One of the biggest costs is the helicopter having to fly up to these things just to see if there is a problem with the tower. Another issue is the cost of just figuring out how to get to the things from the closest airport.

Instead of using a helicopter and having someone investigate power line issues it would be much safer and more cost effective to use a rotary UAS (unmanned aerial system). The rotary UAS will be able to fly extremely close to the power line without risk of major damage to the pilot or anyone else if it comes in contact with the line. This is because of how the propellers on the UAS are positioned; they allow for a stable flight with the ability to make sharp turns. Figure 3 shows an image of a rotary UAS. Notice how the propellers are evenly distributed around the center of the vehicle. Pictures of any damage can be taken with ease because the rotary UAS is able to hover in place and can provide not only pictures of the damage but real time video of any issues.

Figure 3: Rotary UAS equipped with a camera; the propellers allow
the camera to stay stable

A major advantage to using a UAS like this is that you can launch and land the vehicle from virtually anywhere. Not only will this remove the need for an airport, it will also eliminate wasted time waiting for a helicopter to arrive near the power line. Having a helicopter fly close to power lines creates an issue of pilot safety and also the safety of anyone who may be on the ground. Cameras can take amazingly high quality images from a distance, but even then you could get higher quality by using a similar camera mounted on a rotary UAS and having it fly in and hover much closer to the power line.

A disadvantage to using the UAS is that typically these types of vehicles have less flight time. This is where a helicopter outdoes the UAS. Even though the flight time may be less the cost of a potential injury to anyone involved in surveying is nonexistent with the UAS since the pilot can be stationed almost anywhere.

Scenario 3
A pineapple plantation has about 8000 acres, and they want you to give them an idea of where they have vegetation that is not healthy, as well as help them out with when might be a good time to harvest.

When examining the task of finding healthy vegetation over an 8000 acre area, the cheapest option I can think of would be to download a LANDSAT image of the area and then examine the infrared band. LANDSAT is an abbreviation for Land Remote-Sensing Satellite, which is in orbit around the world with a revisit interval of 16 days for the newest satellite (LANDSAT 8). That means that every 16 days there will be a new image of the same area.

LANDSAT has sensors which record light reflectance from the ground, similar to what a normal camera would do, but they can also record reflected infrared energy, which can be used for vegetation analysis: the healthier a plant is, the more near-infrared energy it reflects, which is recorded by the sensor. The files downloaded from LANDSAT represent each band the satellite records light in (red, blue, green, infrared, shortwave infrared, etc.). These bands come as black and white TIFF files which can be used/opened in virtually any kind of image manipulation software. The TIFF files are black and white because of how the sensor records the intensity for each band. For anything blue, such as water, the pixels that make up the water will have a higher pixel value in the blue band than pixels for land. The same principle applies to green objects such as plants and grass, and so on for other colors. The infrared band will give higher pixel values to pixels representing objects that reflect more infrared radiation than other objects.

The infrared band can be opened using any kind of standard image viewing software. The whiter an area is, the more infrared energy being reflected, and thus the healthier the vegetation. In figure 4 below you can see that the agricultural fields are much healthier and closer to harvest than other natural areas in the image. 


Figure 4: Landsat image with healthy vegetation appearing in white; the red
circles highlight the healthy vegetation
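
A common numeric refinement of this band inspection, though not named in the text above, is NDVI (normalized difference vegetation index), computed from the red and near-infrared band TIFFs. A sketch using the rasterio library, with placeholder file names for the Landsat 8 bands:

```python
# Sketch: NDVI = (NIR - Red) / (NIR + Red) from Landsat 8 band TIFFs.
# File names are placeholders; band 4 is red, band 5 is near infrared.
import numpy as np
import rasterio

with rasterio.open("LC08_B4.TIF") as src:
    red = src.read(1).astype(float)
with rasterio.open("LC08_B5.TIF") as src:
    nir = src.read(1).astype(float)

ndvi = (nir - red) / np.maximum(nir + red, 1e-6)  # avoid divide-by-zero
print("mean NDVI:", float(np.nanmean(ndvi)))  # higher = healthier vegetation
```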
This option is completely free as long as you have an internet connection, a way to unzip the downloaded file, and a way to view the files. Although this option saves a lot of money, it does have a few downfalls. First, since the satellite is on a 16 day interval, you won't be able to have images taken on demand, and even if you find an image for a date you want, there is a chance it could be filled with clouds, which would distort or even block the ground altogether. Assuming you go with this method of using the LANDSAT images, you may run into an even bigger problem which would make you start over completely: satellite failure. This has already happened to the previous LANDSAT 7 satellite. The images taken from LANDSAT 7 would be of similar quality to LANDSAT 8, but they include a large amount of missing pixel data, so all of the images produced are virtually useless for any kind of analysis like checking on the health of a pineapple plantation.

A second option would be to attach an infrared camera to a fixed wing UAS (unmanned aerial system) and have it fly over the plantation recording infrared radiation, producing an image very similar to the one produced by LANDSAT. Figure 5 below shows an infrared camera capable of being attached to a UAS. This option of using a UAS will cost a couple thousand dollars, most of which goes to the infrared sensor and the UAS, but the money saved in not having workers check on the entire plantation's health might be worth it. By using the UAS you would be able to have on-demand infrared images taken of the plantation instead of waiting and hoping that the image from LANDSAT is of high quality.

Figure 5: Infrared scanner capable of being attached to a UAS, used to take images in infrared

To discover the best time to harvest, you could examine the infrared images to see when the plantation is mostly white, meaning healthy. By using LANDSAT images you have access to images from previous years, so you could start to see a trend in when the plantation is at its peak health and ready to be harvested. The LANDSAT images would give a good approximation of this trend, but the use of a UAV with an infrared sensor would give a better look at exactly when the plantation is at peak health. Since LANDSAT is free to use, it may not be a bad idea to investigate those images and use the UAV in conjunction.
Scenario 4
An oil pipeline running through the Niger River delta is showing some signs of leaking. This is impacting both agriculture and loss of revenue to the company.

First, many factors need to be accounted for: the agriculture could also be affected by other things, including drought, bad soil, and overproduction.  The Niger River is also known as one of the most polluted rivers in the world, so fixing the oil leak alone might not stop the loss of agricultural area or crops.  Many questions will need to be asked before starting the project, including: what time of the year is it?  This will affect the river's water level and the spread of the oil.  If the Niger River's water level is high, the dispersal of the oil leakage will affect the crops more.  Also, the state of the crops should be known: are they being harvested at this time, or is the season in a transition?

First, an image of the area should be taken to find out where the leakage is occurring.  When looking for an oil leak, areas of black, the color of oil, should be identified.  The area of black will be heaviest near the leak and then spread out as it travels down the river.  If the river is relatively clear, which should also be known before taking the image, the oil leak should be relatively easy to find.  This image can be taken either by a UAV (unmanned aerial vehicle) controlled by a computer or by a balloon, depending on the expense of the equipment and the weather.  The disadvantage of using a UAV to take the image is that it will be expensive, ranging into the thousands, but it will be the easiest and most efficient way to take the image given the range a UAV can have.  A 'normal' high quality camera should be fine for finding the oil leak; no special sensors or image processing should be needed.  The advantage of using a balloon to take the image is that it is very cheap and relatively easy to use compared to flying a UAV.  The disadvantage is that the balloon may be hard to control in the wind, and its range will be less than the UAV's.

However, a third option can be used to get more accuracy: determining the extent of the oil leakage by looking at vegetation health using a near infrared sensor. The health of the agriculture should be in the most danger surrounding the oil leak, getting healthier when moving away from it.  The near infrared image will show healthy vegetation appearing in white and unhealthy vegetation going from gray to black.  Knowing where the agriculture is most unhealthy will help determine the area of the oil spill.  This sensor will be more expensive and will have to be flown on an unmanned aerial system because of the risk of losing it. 

Using the UAV to take an image of the Niger River delta to find the oil leak is the best option in this scenario.  It will be on the higher end of the cost, but with a serious problem like an oil leak, the best option should be used.  A near infrared scanner to look at vegetation health could also be used along with the UAV.  After these steps are taken and clean images are produced, the oil leak should be able to be found and fixed, helping the revenue and stopping the contamination of crops.  Two links that sell UASs: the first is less expensive with lower quality, and the second is more expensive with more UAV options.



Figure 6: Image of a UAV being placed in the air, ready to be flown
around and used to capture images.  The military uses UAVs to
capture aerial images. 
  
Scenario 5
A mining company wants to get a better idea of the volume they remove each week. They don't have the money for LiDAR, but want to engage in 3D analysis.

In order to figure out how much ore you are removing from the open pit mine, you will need to obtain 3-dimensional images of the mine to ultimately create a DEM (digital elevation model) of it. Obtaining these 3-dimensional images can be done through a photogrammetry camera system mounted on a fixed wing UAS. Photogrammetry camera systems have automated film advance and exposure controls, as well as long continuous rolls of film. Aerial photographs should be taken in continuous sequence with an approximate 60% overlap. This overlap between adjacent images enables 3-dimensional analysis for extraction of point elevations and contours. Once the images have been shot by the fixed wing UAS, a technique called least squares stereo matching can be used to produce a dense array of x, y, z data. This is commonly called a point cloud. A DEM image like the one below (figure 7) can then be modeled in ArcGIS to accurately reflect the contours of the mine as well as its elevation levels.


Figure 7: Digital Elevation Model showing elevations.  Red is higher
elevation and blue is lower elevation
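
The 60% overlap requirement translates directly into exposure spacing. A worked sketch, where the ground footprint per photo is an assumed number:

```python
# Worked example of the 60% forward-overlap rule: how far apart
# successive exposures should be triggered.
frame_ground_length = 120.0   # meters of ground covered per photo (assumed)
overlap = 0.60                # 60% forward overlap between adjacent photos

air_base = frame_ground_length * (1 - overlap)
print("trigger a photo every %.0f m of flight" % air_base)  # 48 m
```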
Since you will know the elevation levels of the mine, every new DEM created with subsequent point clouds will reflect the elevation changes that occurred over a given period of time. This change in elevation will allow you to see the volume of ore being taken out of the mine. Obtaining an elevation point cloud with a fixed wing UAS equipped with a photogrammetry camera system is much faster than manually surveying the mine. It can be done as often as needed with relative ease, saving your company massive amounts of time and ultimately money. This method is not as accurate as using LiDAR data, but it is much cheaper. If you were to take weekly readings of the mine using LiDAR, you would spend a fortune on data collection. I see photogrammetry as your most viable option if you are set on taking weekly volume measurements.
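
The weekly volume estimate itself is a DEM subtraction. A numpy sketch on synthetic stand-in DEMs; cell size and elevations are made-up numbers:

```python
# Sketch: difference two weekly DEMs of the pit and multiply the
# elevation change by the cell area to get removed volume.
import numpy as np

cell_size = 0.5                            # meters per DEM cell (assumed)
dem_week1 = np.full((200, 200), 310.0)     # elevations in meters (stand-in)
dem_week2 = dem_week1 - 0.8                # pretend 0.8 m was removed everywhere

dz = dem_week1 - dem_week2                 # positive where material was removed
volume_m3 = float(dz.sum()) * cell_size ** 2
print("ore removed this week: %.1f cubic meters" % volume_m3)
```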

Sunday, February 9, 2014

Field Activity 2: Visualizing and Refining Terrain Survey

Introduction

This project is a follow-up to the previous project, the creation of a digital elevation surface.  The task was to import the data collected from the sandbox terrain as an Excel file into ArcMap and ArcScene, then to create a surface that best represents the data using an interpolation method.  These methods include IDW, Natural Neighbors, Kriging, Spline, and TIN, each of which will be explained in further detail later in the blog.  The next task of this assignment was to reevaluate the data and possibly revisit the sandbox to take more data points (if any areas seemed weak) to enhance the image.

Methods

After surveying the sandbox terrain and collecting data points, the next step is to import the Excel file into ArcMap or ArcScene.  The difference between the two is that ArcScene has the capability of viewing images in 3D.  This project uses different elevation points, making ArcScene very useful when converting the points into an image.  The first step is to add the Excel file into ArcScene and then convert it into a point feature class.  This can be done by clicking file>add data and filling out the rest to fit your needs.  A step by step process of how to convert data into a point feature class can be found here.  No coordinate system or units were used in the process because the data would have become skewed and changed if units were used.  Points will then appear in ArcScene, varying in elevation, ready to be converted using the different interpolation techniques.  Using the ArcToolbox and various tools under 3D Analyst, the point feature class was converted using the five techniques I mentioned earlier.  
Figure 1: Image of the ArcToolbox in ArcScene
3D Analyst Tools opened
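
Scripted, this import step is typically an XY event layer followed by a copy to a feature class. A hedged ArcPy sketch with placeholder paths and field names:

```python
# Sketch: turn an exported table of X, Y, Z readings into a point
# feature class. Paths and column names are placeholders.
import arcpy

table = r"C:\geog336\sandbox_points.csv"   # assumed X, Y, Z columns
arcpy.MakeXYEventLayer_management(table, "X", "Y", "points_lyr",
                                  in_z_field="Z")
arcpy.CopyFeatures_management("points_lyr",
                              r"C:\geog336\terrain.gdb\sandbox_points")
```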
After each tool was used to create an image in the form of IDW, Natural Neighbors, Kriging, Spline, and TIN, the best product was chosen to represent the data, which I will explain in the discussion section of the blog.  One step of this activity was to revisit the sandbox and collect more data.  However, my group did not complete this step because the data collected was very accurate to our original model.  The team did a great job of coming up with an easy and efficient way to collect many data points by using the rope system I mentioned in my previous blog.  Another factor in why we did not collect more data points is that there were several snow storms which ruined our surface; it would be very hard to re-create the exact surface we used in our original data collection.   It was a group decision not to recollect data, and we all thought that it was not necessary and would be inefficient.

This section will discuss the different interpolation techniques used in ArcScene to create surfaces to view the elevation data points.  More on each technique can be found here on an ArcGIS help page.
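
For reference, all five surfaces can be generated from the 3D Analyst toolbox in ArcPy. This is a hedged sketch; the paths, the Z field name, and the kriging and TIN parameter strings are assumptions, not my group's exact settings.

```python
# Sketch: the five interpolation surfaces via 3D Analyst tools
# (requires the 3D Analyst extension). Paths are placeholders.
import arcpy

arcpy.CheckOutExtension("3D")
pts = r"C:\geog336\terrain.gdb\sandbox_points"   # Z = elevation field

arcpy.Idw_3d(pts, "Z", r"C:\geog336\idw_surf")
arcpy.NaturalNeighbor_3d(pts, "Z", r"C:\geog336\nn_surf")
arcpy.Kriging_3d(pts, "Z", r"C:\geog336\krig_surf",
                 "Spherical")                     # semivariogram model
arcpy.Spline_3d(pts, "Z", r"C:\geog336\spline_surf")
# TIN input string format: "<features> <height field> <SF type> <tag>"
arcpy.CreateTin_3d(r"C:\geog336\sandbox_tin",
                   in_features="{} Z Mass_Points <None>".format(pts))
```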

IDW 
IDW stands for Inverse Distance Weighted; it estimates cell values by averaging the values of sample data points in the neighborhood of each processing cell (ArcGIS Help). The IDW surface was created in ArcScene to show a 3D image to better represent the collection of data.  420 was entered in the number of points box when running this tool because our group collected 420 points.
Figure 2: 3D IDW
Red- high elevation, Blue- low elevation

Natural Neighbors
This method finds the closest subset of input samples to a query point and applies weights to them based on proportionate areas to interpolate a value (ArcGIS Help).  Natural neighbor works by weighting each point by how close it is to the cell being interpolated.  This time no point count was entered because it was not required, similar to TIN.  This image was created in ArcMap and is a 2D representation of the data collected.
Figure 3: 2D image of Natural Neighbor
Brown representing high points and Blue low points

Kriging
Kriging is an advanced geostatistical procedure that generates an estimated surface from a scattered set of points with z-values.  Kriging determines height by looking at each value compared to other values.  I cut the number of points in half and entered 210 in the kriging creation box.  9 classes were used to represent the data, with green being the deepest and dark red being the highest elevation.  This is a 3D model of the data created in ArcScene.  
Figure 4: 3D Kriging
Red-high points, Green- low points

Spline
Spline uses an interpolation method that estimates values using a mathematical function that minimizes overall surface curvature, resulting in a smooth surface that passes exactly through the input points (ArcGIS Help).   420 points were also entered when creating this surface, making it similar to natural neighbor and IDW.  Spline also shows cone shaped areas that were not present in our actual creation; the surface we shaped in the snow was very smooth and flat in most areas.  

Figure 5: 3D Spline flipped to show
the other end of the creation
Figure 6: 3D Spline
Red - high elevation Blue - low elevation


TIN
TIN stands for Triangulated Irregular Network; it uses Z values and cell centers to fully cover the perimeter of the surface (Arc Help).  The TIN method displayed the data in a very unique way compared to the rest of the methods: triangles are drawn between the nearest data points, connecting them into a continuous surface.  
Figure 7: 3D TIN 
Discussion

After creating all five of the interpolation surfaces, I came to the conclusion that the kriging method best represents the data.  The kriging method, figure 4, best represents the data because of its smoothness and the ease of interpreting it.  Compared to the other 4 methods, kriging has a smoothness and clearness about the image created.  Using only 210 points, compared to the 420 used for the others, contributed to the best image being created.  If I had been more educated in ArcGIS and ArcScene the outcomes could have been a lot different, but since I am still an amateur in this area most of the creations were not clean and were hard to read.  A big key in choosing the kriging method was how it represented the deepest part of the model.  Looking through each figure it is easy to tell the bottom right corner is very deep and hard to understand.  The kriging model represents this corner best with its smoothness. 
     Figure 2, the IDW technique, was the worst of the five at creating a continuous surface that displayed the data.  The cone shapes you can see in the image look very bad, and they do not represent the data in a good way at all.  I believe that the image is very skewed and cone shaped because there were 420 points collected. If fewer points had been collected, the image would likely have become clearer and possibly more useful.  
     Natural Neighbor represented the data very well but was not my favorite.  It is very similar to kriging, but it does not appear as smooth because of some jaggedness in the surface where the colors change.
     Kriging represented the data collected by my group the best.  The continuous surface created looks very smooth as each color blends into the next, making the image appear clean.  I used 9 classes to represent the data.  The bottom right corner, where the data is deepest, looks very clean and easy to read compared to the rest of the methods. 
     The spline method is similar to natural neighbor in that it represents the data in an un-smooth way.  The area where the data is deepest is badly represented and difficult to read in 3D form in ArcScene, making spline not as useful as some of the other techniques.
     Figure 7: I really liked how deep the TIN data would appear in ArcScene, really displaying the different elevations that were present in the surface.  However, I would not choose this method to best represent the data collected, because the triangles do not capture the smoothness of the data and show triangular shapes instead.  The surface created by our group did not contain any triangular shapes, making the TIN method a poor fit for representing the data.  

     
Conclusion

This field activity, combined with the week before, was very fun and challenging to complete.  It definitely tested critical thinking skills to create a surface and a grid system to measure its elevation.  The hardest thing for me was converting the data into continuous surfaces in ArcGIS.  I had never created any of these surface types other than TIN; with my inexperience with these methods, I found the Arc help menu very useful for creating the images.  I thought the team I belonged to did a fantastic job of completing the task in an efficient and successful way, allowing the project to be completed without many troubles.