Sunday, May 17, 2015

Remote Sensing: Term Project

Marquette County, MI: Using Satellite Imagery and LiDAR for Remote Sensing Analysis

Introduction:

My term project focused on Marquette County, MI, at two scales. Part of the project examined all of Marquette County (Figure 1). The rest focused on the Huron Mountain Club (HMC), a nature reserve in the northern part of Marquette County on the southern shore of Lake Superior (Figure 1). This location was chosen because I am doing research with Dr. Jol on a strandplain in this area, and remote sensing analysis of LiDAR data could help further that research.

Marquette County is located in northwest Upper Michigan and has a population of 67,700 (United States Census Bureau, 2015). The area is a wilderness region and contains part of the Hiawatha National Forest, the Huron National Wildlife Refuge, and part of the Ottawa National Forest. Lake Superior borders the northern edge of the county, which makes Marquette County a hotspot for coastal research.

The four functions performed in this project were: image mosaicking, pan-sharpening, creation of a DSM, and creation of a DTM.
Figure 1: Map of Marquette County (upper) and the Huron Mountain Club (lower).


Remote Sensing Analysis:

Two adjacent satellite images of the Marquette County area were mosaicked to create one continuous image. The mosaicked images covered path 24, row 27, and path 24, row 28. These images were taken on 4/26/15 by the Landsat 8 OLI satellite. Bands 2-7 of Landsat 8 OLI were used to create the 30m satellite images (United States Geological Survey, 2014). In addition, 15m panchromatic images of the Marquette County area were mosaicked so that the mosaicked panchromatic image could be used to pan-sharpen the mosaicked 30m original image. Band 8 of Landsat 8 OLI was used to create the panchromatic image (United States Geological Survey, 2014).

Before mosaicking could occur, the individual bands of each image had to be imported and converted from TIFF to image format using the Import Data tool in Erdas Imagine 2013. After the bands were imported, they were stacked to create the desired satellite image using the Layer Stack tool. To begin the mosaicking process, the images were added to the same viewer through the Multiple and Raster Options tabs in the Select Layers to Add window. The images were then mosaicked using the MosaicPro tool. Compute Active Area was selected under Image Area Options in the Add Images window for both images. The radiometric properties at the intersection of the images were synchronized to maintain a smooth color transition where the images meet. This was accomplished by setting Use Histogram Matching to Overlap Areas in the Color Corrections tool of MosaicPro. Additionally, the Set Overlap Function was set to Overlay. The mosaicked image of the 30m satellite images of Marquette County, MI, can be seen in Figure 2, and the mosaicked 15m panchromatic image can be seen in Figure 3.
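The histogram-matching color correction can be illustrated with a short sketch. This is a minimal NumPy version of the general technique, not MosaicPro's internal algorithm; the function name is my own:

```python
import numpy as np

def match_histogram(source, reference):
    """Remap source brightness values so their cumulative
    distribution matches that of the reference image."""
    src_vals, src_idx, src_counts = np.unique(
        source.ravel(), return_inverse=True, return_counts=True)
    ref_vals, ref_counts = np.unique(reference.ravel(), return_counts=True)
    # Cumulative distribution functions, scaled to [0, 1]
    src_cdf = np.cumsum(src_counts) / source.size
    ref_cdf = np.cumsum(ref_counts) / reference.size
    # For each source value, find the reference value at the same CDF level
    matched_vals = np.interp(src_cdf, ref_cdf, ref_vals)
    return matched_vals[src_idx].reshape(source.shape)
```

In practice the matching would be computed only over the overlap area of the two scenes, as set in the Color Corrections dialog.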

The mosaicked 15m panchromatic image of the Marquette County area was used to pan-sharpen the mosaicked 30m reflective image of the same area. This was accomplished using the Resolution Merge feature under the Pan Sharpen icon in the Raster tab of Erdas Imagine 2013. In the Resolution Merge window, the Multiplicative algorithm was selected as the Method and Nearest Neighbor was chosen as the Resampling Technique. The results of the pan-sharpening can be seen in Figures 4 and 5. Figure 4 shows a small-scale comparison of the original reflective image (left) and the pan-sharpened image (right). At this scale, the pan-sharpened image shows little difference in spatial quality from the original 30m image; in fact, the original looks better because of its stronger coloring. Differences in spatial quality become apparent when zooming in to the same spot on both images (Figure 5). The pan-sharpened image shows more detail and is less pixelated than the original, which helps in identifying the strandplain at the center of the picture, to the right of the river (Figure 5).
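The Multiplicative method is conceptually simple: resample the multispectral band to the panchromatic grid, then multiply by the panchromatic band to inject spatial detail. A hedged NumPy sketch of the idea (not Erdas's exact implementation; function and variable names are my own):

```python
import numpy as np

def multiplicative_pansharpen(ms_band, pan):
    """Multiplicative pan-sharpening: upsample a multispectral band to
    the panchromatic grid with nearest-neighbor resampling, then
    multiply by the panchromatic band."""
    zoom = pan.shape[0] // ms_band.shape[0]        # e.g. 30 m -> 15 m gives 2
    upsampled = np.repeat(np.repeat(ms_band, zoom, axis=0), zoom, axis=1)
    sharpened = upsampled.astype(float) * pan
    # Rescale back to the multispectral band's value range
    sharpened *= ms_band.max() / sharpened.max()
    return sharpened
```

The multiplication step is also why the output coloring looks washed out compared to the original, as noted in Figure 4.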

After the mosaicking and pan-sharpening had been completed for Marquette County, I focused in on the Huron Mountain Club (HMC), which is located in Upper Michigan on the southern shore of Lake Superior. For this particular location, I used LiDAR data to create a digital surface model (DSM) and digital terrain model (DTM). The LiDAR data was downloaded from the Digital Coast Data Access Viewer (NOAA Office for Coastal Management, 2015).

A digital surface model (DSM) and digital terrain model (DTM) were created using the LAS Dataset to Raster Tool in ArcMap 10.2.2. The results for the DSM and DTM can be seen in Figures 6 and 8, respectively.

The parameters for the DSM model were:
-Interpolation: Binning
-Cell Assignment Type: Maximum
-Void Filling: Natural Neighbor
-Sampling Type: Cellsize
-Sampling Value Field: 0.000656167979 ft (0.0002 m)

The parameters for the DTM model were:
-Interpolation: Binning
-Cell Assignment Type: Minimum
-Void Filling: Natural Neighbor
-Sampling Type: Cellsize
-Sampling Value Field: 0.000656167979 ft (0.0002 m)
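The binning step behind both models can be sketched in NumPy: each LiDAR point falls into a grid cell, and the cell keeps the maximum elevation (DSM, canopy and other surface returns) or the minimum (DTM, ground returns). This is a toy illustration of the concept, not the LAS Dataset to Raster implementation; names and values are my own:

```python
import numpy as np

def bin_lidar(x, y, z, cell, assignment="max"):
    """Binning interpolation: assign each point to a grid cell, then keep
    the maximum elevation per cell (DSM) or the minimum (DTM). Empty cells
    stay NaN; the tool would void-fill these (e.g. natural neighbor)."""
    cols = ((x - x.min()) / cell).astype(int)
    rows = ((y.max() - y) / cell).astype(int)
    grid = np.full((rows.max() + 1, cols.max() + 1), np.nan)
    better = np.fmax if assignment == "max" else np.fmin
    for r, c, elev in zip(rows, cols, z):
        grid[r, c] = elev if np.isnan(grid[r, c]) else better(grid[r, c], elev)
    return grid

# Toy points: two returns (ground at 5 m, canopy at 20 m) in one cell
x = np.array([0.0, 0.4, 1.2]); y = np.array([0.0, 0.0, 0.0])
z = np.array([5.0, 20.0, 3.0])
dsm = bin_lidar(x, y, z, cell=1.0, assignment="max")
dtm = bin_lidar(x, y, z, cell=1.0, assignment="min")
```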

I had trouble creating the DSM and DTM. After troubleshooting, I discovered that the sampling value had to be four times the average point spacing, which was about 1×10⁻⁵. This meant that I had to set my sampling value to 0.0002 m, which is equal to 0.000656167979 ft. After I changed the Sampling Value field, the DSM and DTM products were successfully made.
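The fix amounts to a rule of thumb plus a unit conversion, which can be written out explicitly (hedged arithmetic; the spacing value is whatever the dataset reports):

```python
M_PER_FT = 0.3048

def min_cell_size(avg_point_spacing):
    """Rule of thumb from the troubleshooting above: the output cell
    size should be at least four times the average point spacing."""
    return 4 * avg_point_spacing

# The 0.0002 m cell size used above, converted to feet for the tool
cell_ft = 0.0002 / M_PER_FT   # ~0.000656167979 ft
```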

Hillshades were created for the DSM and DTM images using the Hillshade tool under 3D Analyst Tools. The hillshades for the DSM and DTM can be seen in Figures 7 and 9, respectively.
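Conceptually, a hillshade simulates illumination of the terrain from a given sun position. A minimal NumPy sketch of the standard formulation (sun azimuth and altitude combined with slope and aspect); this is an approximation of the general method, not the 3D Analyst implementation, and the names are my own:

```python
import numpy as np

def hillshade(dem, cell, azimuth_deg=315.0, altitude_deg=45.0):
    """Shaded relief of a DEM from a sun azimuth/altitude, scaled 0-255."""
    az = np.radians(360.0 - azimuth_deg + 90.0)    # compass -> math angle
    alt = np.radians(altitude_deg)
    dzdy, dzdx = np.gradient(dem, cell)            # elevation derivatives
    slope = np.arctan(np.hypot(dzdx, dzdy))
    aspect = np.arctan2(dzdy, -dzdx)
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(255 * shaded, 0, 255)
```

A flat surface shades to a uniform gray determined by the sun altitude, which is why low-relief areas show little contrast in Figures 7 and 9.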

A LiDAR image of the area showing point elevation was also captured, as seen in Figure 10. The image displays the different beach ridges that exist in the strandplain. This was very helpful for my research with Dr. Jol.


Figure 2: Mosaicked image of the 30m satellite images of Marquette County, MI.

Figure 3: Mosaicked 15m panchromatic image of Marquette County, MI.

Figure 4: At small scale, the pan-sharpened image (right) does not show much difference in spatial quality from the original satellite image (left).

Figure 5: The pan-sharpened image (right) shows more detail than the original image (left).

Figure 6: DSM shows the coastline of the Huron Mountain Club (HMC)

Figure 7: DSM Hillshade

Figure 8: DTM shows the bare-earth model of the Huron Mountain Club.

Figure 9: DTM Hillshade

Figure 10: LiDAR of HMC shows the topography of the strandplain.

Conclusions:

Image mosaicking created a satellite image that spanned the entire area of Marquette County, MI. Pan-sharpening enhanced the spatial resolution of the image, which made analyzing features in the image easier and more accurate. The remote sensing analysis demonstrates that mosaicking and pan-sharpening are effective tools for creating and analyzing remotely sensed images.

The LiDAR data used for the project produced very similar DSM and DTM hillshades. This was due to the very low variance in brightness values. The DTM was effective for showing the topography of the bare landscape, but the DSM did not show surface features very well. A proper DSM would show the many pine trees that exist in the Huron Mountains, but my DSM did not. The remote sensing analysis demonstrates that DTMs and DSMs are useful for studying landscapes, but the right kind of data must be used for these tools to be effective.

Future studies could be improved by acquiring LiDAR for all of Marquette County, not just the Lake Superior coast. Additionally, collected LiDAR data should be evaluated before processing to determine whether an appropriate range of brightness values exists to create proper DSMs and DTMs. This would ensure better results.

Overall, this project has helped me practice remote sensing skills that I have learned throughout the course. These skills will come in handy in my future career.


References

NOAA Office for Coastal Management. (2015). Digital Coast Data Access Viewer [LAS file]. Retrieved from http://coast.noaa.gov/dataviewer/#

United States Census Bureau. (2015). QuickFacts Beta. [Data file]. Retrieved from http://www.census.gov/quickfacts/table/PST045214/26103,00

United States Geological Survey. (2014). Earth Explorer [TIFF file]. Retrieved from http://earthexplorer.usgs.gov/

Thursday, May 7, 2015

Lab 8: Analysis of Spectral Reflectance Signatures for Earth Surfaces and Near Surface Materials

Goals and Background:

The purpose of this lab was to develop skills in measuring and interpreting spectral reflectance signatures of Earth surfaces and near surface materials. Spectral signatures were collected from satellite images, graphed, and analyzed. This lab will prepare me to collect and analyze spectral signatures for any Earth surface and near surface feature from multispectral remotely sensed images.

Methods:

The Landsat ETM+ image eau_claire2000.img (United States Geological Survey, 2015) was used to collect and analyze spectral signatures of near surface features and various Earth surfaces. The image covered Eau Claire, WI, and surrounding parts of Wisconsin and Minnesota. Spectral signatures of twelve materials and surfaces were collected in the lab as follows:
  1. Standing water
  2. Moving water
  3. Vegetation
  4. Riparian vegetation
  5. Crops 
  6. Urban grass
  7. Dry soil (uncultivated)
  8. Moist soil (uncultivated)
  9. Rock
  10. Asphalt highway
  11. Airport runway
  12. Concrete surface (Parking lot)
Spectral signatures were collected in Erdas Imagine 2013 using the spectral tools. The collection process started by using the Polygon tool under the Drawing heading to create a polygon around the desired material/surface. The spectral signature of the polygon was then plotted using the Signature Editor tool. The plot was displayed using the Display Mean Plot Window tool.
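Conceptually, the Signature Editor's mean plot averages the brightness values of each band over the pixels inside the polygon. A minimal NumPy sketch (the array layout and names are my own assumptions, not Erdas internals):

```python
import numpy as np

def mean_signature(image, mask):
    """Mean spectral signature: average brightness value per band over
    the pixels inside a polygon mask.
    image: (bands, rows, cols) array; mask: (rows, cols) boolean array
    where True marks pixels inside the digitized polygon."""
    return image[:, mask].mean(axis=1)
```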

Spectral signatures for each individual material/surface can be seen in Figures 1-12. Descriptions of the results are discussed in Table 1. Differences between dry and moist soil spectral signatures are seen in Figure 13. All spectral signatures are plotted on one plot in Figure 14.


Results:

Signature | Band with highest reflectance      | Band with lowest spectral reflectance
1         | Blue (0.45-0.5um)                  | Near and mid infrared (0.7-3.0um)
2         | Blue (0.45-0.5um)                  | Mid infrared (1.3-3.0um)
3         | Red and near infrared (0.62-1.3um) | Green (0.5-0.58um)
4         | Red and near infrared (0.62-1.3um) | Green (0.5-0.58um)
5         | Red and near infrared (0.62-1.3um) | Green (0.5-0.58um)
6         | Blue (0.45-0.5um)                  | Red and near infrared (0.62-1.3um)
7         | Near and mid infrared (0.72-3.0um) | Red and near infrared (0.62-1.3um)
8         | Near and mid infrared (0.72-3.0um) | Red and near infrared (0.62-1.3um)
9         | Near and mid infrared (0.72-3.0um) | Green (0.5-0.58um)
10        | Near and mid infrared (0.72-3.0um) | Red and near infrared (0.62-1.3um)
11        | Near and mid infrared (0.72-3.0um) | Red and near infrared (0.62-1.3um)
12        | Blue (0.45-0.5um)                  | Red and near infrared (0.62-1.3um)

Table 1


Figure 1: Mean plot for standing water.
Standing water had the highest reflectance in the blue band because of Rayleigh scattering. Low reflectance in the near and mid infrared bands was due to water absorbing radiation. 

Figure 2: Mean plot for moving water. 

Figure 3: Mean plot for vegetation.
Vegetation had the highest reflectance in the red and near infrared bands (0.62-1.3um), because plant tissue strongly reflects radiation in those bands rather than absorbing it. The lowest reflectance was in the green band (0.5-0.58um), where the plants absorb radiation for photosynthesis.

Figure 4: Mean plot for riparian vegetation. 

Figure 5: Mean plot for crops.

Figure 6: Mean plot for urban grass.

Figure 7: Mean plot for dry soil (uncultivated).

Figure 8: Mean plot for moist soil (uncultivated).

Figure 9: Mean plot for rock.

Figure 10: Mean plot for asphalt highway.

Figure 11: Mean plot for airport runway.

Figure 12: Mean plot for concrete surface (parking lot).

Figure 13: Mean plot shows differences between dry and moist soil.
Dry and moist soil differ the most in the mid infrared band (1.3-3.0um) because water in the moist soil absorbs more of the radiation.

Figure 14: Mean plot showing spectral signatures for all surfaces.
Spectral signatures for vegetation, riparian vegetation, and crops were very similar. This is because they are all types of vegetation that reflect radiation in very similar ways. The asphalt highway and airport runway spectral signatures were also very similar, because runways are commonly made of asphalt. The spectral signature for rock was very different from those of the other hard surfaces (asphalt, airport runway, and concrete), because those surfaces are man-made, which gives them reflective properties that differ from natural rock. Urban grass had a very different spectral signature compared to vegetation, riparian vegetation, and crops. This is because urban grass has less moisture content, which causes it to have higher reflectance in the blue and green bands.

Sources:

United States Geological Survey. (2015). [Satellite image in .img format]. Earth Resources Observation and Science Center. Retrieved from http://eros.usgs.gov/

Thursday, April 30, 2015

Lab 7: Stereoscopy and Orthorectification

Goals and Background:

The purpose of this lab was to gain skills in photogrammetry using aerial and satellite images. The first part of the lab developed an understanding of how to calculate scales, measure area and perimeters of features, and calculate relief displacement. The rest of the lab developed skills in stereoscopy and orthorectification of satellite images. The lab helped me develop photogrammetry skills, which will be useful in a remote sensing career.

Methods:

Part 1: Calculating scale of nearly vertical aerial photographs

Section 1:

The scale of an aerial photograph (Eau Claire_West-se.jpg) of Eau Claire (Erdas Imagine, 2009) was calculated using the equation below:

S = pd/gd  (1)

Where:
S = scale
pd = photo distance (measured on the photograph)
gd = ground distance (real-world distance)

The photo distance was measured using a ruler, and the ground distance was provided (8822.47 ft).

The scale of a second aerial photograph, collected by a high altitude reconnaissance aircraft (United States Department of Agriculture, 2005), was calculated using the equation below:

S = f/(H-h)  (2)

Where:
S = scale
f = lens focal length
H = altitude above sea level
h = elevation of terrain

The lens focal length (152 mm), altitude above sea level (20,000 ft), and terrain elevation (796 ft for Eau Claire) were provided.
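Both scale equations reduce to simple arithmetic, which can be sketched in Python (function names are mine; the input values mirror those given above, and equation (2) works out to roughly 1:38,500):

```python
MM_PER_FT = 304.8

def scale_from_distances(photo_dist, ground_dist):
    """Eq. (1): S = pd / gd, returned as the scale denominator.
    Both distances must be in the same units."""
    return ground_dist / photo_dist

def scale_from_altitude(focal_mm, altitude_ft, terrain_ft):
    """Eq. (2): S = f / (H - h), returned as the scale denominator.
    Converts the focal length from millimeters to feet first."""
    focal_ft = focal_mm / MM_PER_FT
    return (altitude_ft - terrain_ft) / focal_ft

denom = scale_from_altitude(152.0, 20000.0, 796.0)   # roughly 1:38,500
```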

Section 2: 

The area and perimeter of a lagoon in an aerial photograph were measured using the "Measure Perimeters and Area" tool in Erdas Imagine 2013. Area was reported in both ha and acres, while perimeter was recorded in both meters and miles.

Section 3:

The relief displacement of a smoke stack in an aerial photograph of Eau Claire was calculated using the equation below:

d = (h*r)/H  (3)

Where:
d = relief displacement
h = height of object (real world)
r = radial distance of top of displaced object from principal point (photo)
H = height of camera above local datum

The height of the camera above the local datum was 3980 ft. The real-world height of the smoke stack and the radial distance were calculated using a ruler and the scale of the aerial photograph (1:3209).
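Equation (3) can likewise be written as a one-line helper (the example values are hypothetical, not the lab's measurements):

```python
def relief_displacement(obj_height, radial_dist, flying_height):
    """Eq. (3): d = (h * r) / H. obj_height and flying_height share
    one unit; the result comes out in the units of radial_dist."""
    return obj_height * radial_dist / flying_height

# Hypothetical example: a 200 ft object whose top is 2.5 in from the
# principal point, photographed from 3980 ft above the local datum
d = relief_displacement(200.0, 2.5, 3980.0)
```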


Part 2: Stereoscopy

Ground control points (GCPs) were used to create a 3D display of the City of Eau Claire (Figure 1). A DEM (ec_dem2.img) was used to create the stereoscopic view of Eau Claire (ec_city.img) using the Anaglyph tool.

Part 3: Orthorectification

The Erdas Imagine Leica Photogrammetric Suite (LPS) was used to orthorectify and planimetrically correct a satellite image of Palm Springs, CA (Figure 2).

Section 1:

A new block file was created in LPS Project Manager using SPOT satellite images of Palm Springs, CA (Erdas Imagine, 2009). The geometric model interface was set to "polynomial-based pushbroom" and the geometric model category was set to "SPOT pushbroom". In the "Projection Chooser" dialog (in the Horizontal Reference Coordinate System section of the Block Property Setup dialog), the following parameters were set:

Projection Type: UTM
Spheroid Name: Clarke 1866
Datum Name: NAD27 (CONUS)
UTM Zone: 11
North or South field: North
Horizontal Units: Meters

No changes were made to the Vertical section of the Reference Coordinate System.

Section 2: 

Satellite imagery of Palm Springs, CA (Spot_pan.img) was added to the block, and the parameters of the SPOT pushbroom sensor were verified during this section of the lab. No changes needed to be made to the image, but the verification process was carried out so that the sensor would be specified. The verification process turned the Int. column in LPS Project Manager green, which would later allow me to complete the orthorectification process.

Section 3:

GCPs were collected for the image spot_pan.img using the Classic Point Measurement Tool. An orthorectified image (sx_ortho.img) was used as the reference for collecting GCPs. The GCPs were collected by first selecting points on the reference image (sx_ortho.img). Reference control points had to match the provided control point coordinates within 10 meters (Table 1). Corresponding image coordinates were then collected on spot_pan.img using the Point Measurement Tool. These coordinates had to match the provided control point coordinates within 2 pixels (Table 2).


Point ID | X Reference | Y Reference
1        | 566189.190  | 3773586.979
2        | 555690.659  | 3728387.770
3        | 501918.953  | 3732595.411
4        | 515114.084  | 3759740.576
5        | 543537.306  | 3779981.255
6        | 558640.300  | 3751516.718
7        | 532062.982  | 3724946.633
8        | 539381.670  | 3768419.388
9        | 526013.661  | 3753709.856
11       | 545372.750  | 3741643.250
12       | 540901.659  | 3746876.633
Table 1


Point ID | Image Name | X File   | Y File
1        | Spot_pan   | 5239.468 | 337.384
2        | Spot_pan   | 5191.590 | 1969.546
3        | Spot_pan   | 230.925  | 5378.823
4        | Spot_pan   | 869.542  | 2487.996
5        | Spot_pan   | 3027.570 | 51.432
6        | Spot_pan   | 4999.412 | 2636.848
7        | Spot_pan   | 3064.254 | 5673.794
8        | Spot_pan   | 2890.880 | 1258.852
9        | Spot_pan   | 1978.138 | 2919.004
11       | Spot_pan   | 3982.969 | 3817.813
12       | Spot_pan   | 3469.092 | 3367.939
Table 2
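The two tolerance checks (10 m for reference coordinates, 2 pixels for image coordinates) amount to a Euclidean-distance test on each point. A small sketch, with names of my own:

```python
import numpy as np

def within_tolerance(measured, reference, tol):
    """True where each measured (x, y) point lies within `tol` of its
    reference point: 10 m for ground coordinates, 2 px for image ones."""
    diff = np.asarray(measured, dtype=float) - np.asarray(reference, dtype=float)
    return np.hypot(diff[:, 0], diff[:, 1]) <= tol
```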

After the first two GCPs were collected, the Automatic (x, y) Drive function was used to help collect GCPs faster. Using the Automatic (x, y) Drive, coordinate values were input into the Point Measurement Tool, which placed the coordinate on the reference image and positioned the regular image in the approximate location of the coordinate.

The last two control points (Point ID 11 and Point ID 12) were collected from NAPP_2m-ortho.img, a different horizontal reference source. The 10th control point was skipped to make the difference between the two horizontal reference sources clear.

Elevation information from the DEM file titled palm_springs_dem was collected for the horizontal reference GCPs obtained from the xs_ortho and NAPP_2m-ortho images. This task was accomplished using the Reset Vertical Reference Source icon in the Point Measurement tool palette. The Update Z Values on Selected Points icon was used to update z values of all reference points in the cell array based on the values in the palm_springs_dem.

Section 4:

The Type and Usage values for the control points were set using the Formula dialog in Point Measurement Tool. All type values were set to Full, while all Usage values were set to Control.

A second image (spot_panb.img) was added to the block because the collection of reference points on the first image was complete. The Spot Pushbroom Frame Editor parameters were accepted, which turned the Int. column in the LPS Project Manager green.

GCPs were collected on spot_panb.img based on the points already collected in spot_pan.img. Collected GCPs for spot_panb.img had to match within two pixels of the values in Table 3. Point IDs 3, 4, and 7 were not collected for spot_panb.img because they were not on the image.


Point ID | Image Name | X File   | Y File
1        | Spot_panb  | 2857.270 | 753.852
2        | Spot_panb  | 3003.782 | 5387.892
5        | Spot_panb  | 1022.701 | 644.456
6        | Spot_panb  | 2736.125 | 3070.227
8        | Spot_panb  | 937.482  | 1862.696
9        | Spot_panb  | 221.445  | 3594.113
12       | Spot_panb  | 1499.230 | 3923.753
Table 3

Section 5:

Tie points were collected for spot_pan.img and spot_panb.img using the Automatic Tie Point Generation Properties icon in the Point Measurement tool palette. The following parameters were used:

Image used: All Available
Initial Type: Exterior/header/GCP
Image Layer Used for Computation: 1
Intended Number of Points/Image: 40
Keep All Points: check box is unchecked

The accuracy of a few tie points was confirmed to ensure proper orthorectification.

Triangulation was performed using the LPS Project Manager. The following parameters were used:

Iterations With Relaxation value: 3
Image Coordinate Units for Report: Pixels
Ground Point Type: Same Weighted Values
Ground Point Standard Deviations: 15 (for X, Y, and Z fields)
Simple Gross Error Check Using: check box is checked
Times of Unit Weight: 3.0


The triangulation report was saved as an ASCII text file. It should be noted that the Ext. columns in the cell array were green after completing the triangulation process.

The final orthorectified image was created using the Start Ortho Resampling Process icon in LPS Project Manager. The following parameters were used:

DTM Source: palm_springs_dem.img
Output Cell Sizes: 10.0 (for both X and Y)
Resampling method: Bilinear Interpolation
Input File Name: spot_panb
Use Current Cell Sizes: check box is checked
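Bilinear interpolation, the resampling method chosen above, estimates each output cell from its four nearest input cells, weighted by distance. A minimal sketch of the idea (not Erdas's implementation; names are mine):

```python
import numpy as np

def bilinear_sample(raster, x, y):
    """Bilinear interpolation at fractional pixel position (x, y):
    distance-weighted average of the four surrounding cell values."""
    x0, y0 = int(x), int(y)
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * raster[y0, x0]
            + dx * (1 - dy) * raster[y0, x0 + 1]
            + (1 - dx) * dy * raster[y0 + 1, x0]
            + dx * dy * raster[y0 + 1, x0 + 1])
```

Compared to nearest neighbor, this smooths the output, which suits a continuous orthoimage resampled against a DEM.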

Section 6:

The orthorectified images were viewed using LPS Project Graphic Status and an image viewer (Figure 2). The swipe function was used to evaluate the spatial accuracy of the area of overlap in the images. The block file was saved before closing the program.

Results:


Figure 1: Anaglyph image of Eau Claire shows changes in elevation in 3D.
The elevation changes are most pronounced around Putnam Park at the center of the image.

Figure 2: The orthorectified images of Palm Springs, CA, are relatively seamless.

Sources:

Hexagon Geospatial. (2009). Erdas Imagine [Computer software]. Norcross, GA.

United States Department of Agriculture. (2005). [Satellite images in .img format]. National Agriculture Imagery Program. Retrieved from https://gdg.sc.egov.usda.gov/

United States Department of Agriculture Natural Resources Conservation Service. (2010). [Digital elevation model for Eau Claire in .dbf format]. Retrieved from http://www.nrcs.usda.gov/wps/portal/nrcs/site/national/home/