Survey Techniques (Continued)

INTRODUCTION

This post is a continuation of the "Improvised Survey Techniques to Create a Digital Elevation Surface" report. The previous report included a method of sampling elevation points within a sandbox and recording the data to produce a digital elevation surface. For this report, the improvised survey data collected previously is normalized. As defined by ESRI, data normalization is the process of organizing, analyzing, and cleaning data to increase efficiency for data use and sharing.

This involved restructuring notes from the field journal into standardized columns of x, y, and z values for the sandbox grid so they could be transferred into ArcMap. The elevation data was then modeled using various forms of interpolation. The systematic approach to surveying produced a list of elevations on a grid but omitted all elevations between sample points. Choosing the right interpolation method is therefore important, because each method estimates the elevations between collected points differently.

METHODS

After the data was normalized, the points were brought from Excel into ArcMap as an X,Y event layer, which was then exported as a feature class into a geodatabase. Since the points were recorded relative to a reference point at (0,0), a cadastral coordinate system was used without projecting the data. The grid was then run through various interpolation methods to determine the best technique for the data gathered. The following interpolations, as defined by ESRI, were used.
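As a rough illustration of the normalization step outside of Excel, the restructuring of field-journal notes into standardized x, y, z columns can be sketched in Python. The journal strings below are hypothetical stand-ins for the actual field notes:

```python
import csv
import io

# hypothetical field-journal entries: "row column depth" readings
notes = ["0 0 3.5", "0 1 4.0", "1 0 2.5"]

# normalize each line into a numeric (x, y, z) tuple
rows = [tuple(float(v) for v in line.split()) for line in notes]

# write standardized x,y,z columns, ready for an X,Y event layer
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["x", "y", "z"])
writer.writerows(rows)
```

The resulting CSV has one header row and one row per sample point, the shape ArcMap expects when adding X,Y data.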

Inverse Distance Weighted (IDW):

uses a method of interpolation that estimates cell values by averaging the values of sample data points in the neighborhood of each processing cell. The closer a point is to the center of the cell being estimated, the more influence, or weight, it has in the averaging process. Because IDW draws on many sample points to estimate each cell, it is relatively resource-intensive, but it can yield strongly interrelated results.
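The distance-weighting idea can be sketched in a few lines of NumPy. This is a conceptual sketch of the averaging step, not ArcMap's implementation; the sample points are made up, and the power of 2 matches the tool's default exponent:

```python
import numpy as np

def idw(xy, z, q, power=2):
    """Estimate elevation at query point q by inverse-distance weighting.

    xy: (n, 2) array of sample coordinates; z: (n,) sample elevations.
    """
    d = np.linalg.norm(xy - q, axis=1)
    if np.any(d == 0):                 # query coincides with a sample point
        return float(z[np.argmin(d)])
    w = 1.0 / d**power                 # nearer points carry more weight
    return float(np.sum(w * z) / np.sum(w))

# four corner samples of a tiny grid
pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], float)
elev = np.array([10.0, 20.0, 10.0, 20.0])

# the center is equidistant from all four samples, so weights are equal
center = idw(pts, elev, np.array([0.5, 0.5]))  # -> 15.0
```

At a sample point the function returns the sample's own value, which mirrors IDW's exact-interpolator behavior at input locations.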

Spline:

uses an interpolation method that estimates values using a mathematical function that minimizes overall surface curvature, resulting in a smooth surface that passes exactly through the input points. This is good for terrain that has lower elevation variability.
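A comparable minimum-curvature surface can be produced outside ArcMap with SciPy's thin-plate-spline radial basis interpolator. This is an illustrative sketch with invented sample points, not the Spline tool itself:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# four flat corners and one raised center sample (a small mound)
pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.5, 0.5]], float)
elev = np.array([0.0, 0.0, 0.0, 0.0, 5.0])

# thin-plate spline: smooth surface passing exactly through the inputs
spline = RBFInterpolator(pts, elev, kernel='thin_plate_spline')

at_peak = float(spline(np.array([[0.5, 0.5]]))[0])  # exact at input points
```

Because the default smoothing is zero, the surface honors every input value exactly while minimizing curvature between them, which is why spline output looks smooth on low-variability terrain.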

Natural Neighbor:

finds the closest subset of input samples to a query point and applies weights to them based on proportionate areas to interpolate a value. Since it only uses neighboring points, it is better suited for compact datasets and terrain that has higher elevation variability.

Kriging:

an advanced geostatistical procedure that generates an estimated surface from a scattered set of points with z-values. Since it draws from scattered points and does not restrict which points are used, repeated data points sharing common x and y values in a dataset will all be used.

TIN:

a vector data structure that partitions geographic space into contiguous, nonoverlapping triangles. The sample points are connected by lines to form Delaunay triangles. TINs require coordinates in defined physical distance units (not decimal degrees) for proper processing, and have implications for the measuring of planimetric and surface areas, as well as volume. Additionally, TINs have the potential for higher resolution in areas with high elevation variability.
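The Delaunay triangulation underlying a TIN can be demonstrated with SciPy. This is an illustration of the triangulation concept on made-up points, not ArcMap's TIN tool:

```python
import numpy as np
from scipy.spatial import Delaunay

# four survey points at the corners of a unit square
pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], float)

# Delaunay triangulation: the faces of the TIN
tin = Delaunay(pts)

# tin.simplices holds one row of three vertex indices per triangle;
# four corner points triangulate into exactly two triangles
triangles = tin.simplices
```

Each triangle face can then carry a plane fitted through its three vertices' z-values, which is why a sparse TIN looks faceted rather than smooth, as seen in Figure 5.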

After being interpolated and modeled in ArcMap, each raster was brought into ArcScene for a 3D view of the elevation change. The 2D rasters shown in ArcScene float on a custom surface to emulate the original data points, and the stretched color scheme provides an easy-to-decipher view of elevation. These scenes were then exported as JPEGs and inserted into ArcMap to give a better 3D view of the actual surface elevation.

RESULTS/DISCUSSION

IDW

The IDW interpolation method gave a very good representation of surface elevation, but at the cost of smoothness (Figure 1). IDW distinguished areas of greater elevation change with ease; for example, the valleys and mounds of each letter (J, O, E) are highly distinguishable. However, a goose-bump effect can be observed in areas of nearly identical elevation.
Figure 1: Digital surface model using the IDW interpolation method

Spline

The spline interpolation method produced the visually smoothest surface while best preserving the elevation data and best representing the elevation of the grid (Figure 2). Here, the letters can all be easily observed and are not distorted by a goose-bump effect or by sharp, jagged changes in elevation.
Figure 2: Digital elevation model using the spline interpolation method

Natural Neighbor

The natural neighbor interpolation method had similar results to the spline technique in representing elevation (Figure 3). However, it showed its weakness in creating a smooth, natural surface as can be observed in the letter O.
Figure 3: Digital elevation model using the natural neighbor interpolation method

Kriging

The Kriging interpolation method showed more smoothness than the natural neighbor technique, but at the cost of accuracy in representing elevation (Figure 4). The valley that forms the J does not show its depth to the extent of the previous three methods.
Figure 4: Digital elevation model using the Kriging interpolation method

TIN

Creating a TIN yielded a very accurate representation of elevation between surveyed data points (Figure 5). However, the triangles generated from the TIN do not represent a very realistic surface at the frequency of data points collected. 
Figure 5: Digital elevation model using the TIN interpolation method
When comparing the digital surface elevation models to the data collected from the sandbox (Figure 6), the choice of interpolation greatly changed how the sandbox was modeled. After reviewing how each model turned out, the systematic sampling method used proved a good way to produce digital models. With the majority of elevation variance through the mid-section, an adequate number of points was rightly concentrated there to show the ridges and valleys. The TIN would have produced better results on the curved shape of the O if more points had been collected, since it is difficult to represent circular shapes naturally with triangles. However, collecting data with string and a ruler makes gathering more sample points quite difficult.


Figure 6: The actual surface elevation of the sandbox collected in Survey Techniques Part I
Overall, the sampling and interpolation methods used allowed an area of elevation to be accurately depicted digitally. This allows for further analysis, calculations, and a better understanding of an area to be made.

CONCLUSIONS

The survey data collection and representation parallel techniques used in the field on a number of levels. In this report, a cadastral coordinate system was used by logging elevation points on a grid based on a single point of origin. In the field, points would most likely be logged with a GPS device and then projected when transferred into a digital model. Here, the rope grid simulated a datum in which elevation was recorded at a systematic interval, emulating data nodes in a projected coordinate system.

In the field, it is not always realistic, in terms of cost or time, to collect as many points as one might like for digital modeling. In that case, choosing the interpolation method that best represents the real-world area is especially important. The most efficient sampling method should also be considered before data collection to maximize accuracy within a set time or budget limit.

Finally, sampling and interpolation methods are not limited to elevation points; they apply to any surveyed data points that need to represent an area. For example, a hot spot map modeling the frequency of an observation at given points would be a valid use of these techniques.
