- Date: Apr-2010 (process 1 of 26)
-
All the ASCII HYPACK navigation files collected were placed in their own folder on the computer based on day of collection. Each file follows the filename convention LLL_TTTT.RAW, where LLL is the line number and TTTT is the start time (UTC) of data collection in the format HHMM. In order to parse these files to extract navigation and depth information, two scripts were run under Cygwin. The first script, "donav", is an executable shell script that cycles through all the *.RAW files in the folder and calls an AWK script to extract particular lines of information.
donav:
files=`ls *.RAW | cut -d. -f1`
for file in $files
do
awk -f awkit $file.RAW > $file.navdep
done
The AWK script "awkit" simply extracts any line in the HYPACK file that contains either the string "GPGGA" or "SDDBT" and writes it to a new file with the extension .navdep.
awkit:
{
if ($0 ~ /GPGGA|SDDBT/) {
print $0
}
}
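For illustration, the records awkit keeps look roughly like the following NMEA-style GPGGA (position fix) and SDDBT (depth below transducer) strings. These are schematic examples with made-up values, not actual survey records, and the HYPACK RAW file may carry additional fields around them:
$GPGGA,121300.00,3835.123456,N,07512.654321,W,2,09,1.0,2.1,M,-36.0,M,5.0,0138
$SDDBT,12.1,f,3.7,M,2.0,F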
This process step and all subsequent process steps were performed by the same person - VeeAnn A. Cross.
Person who carried out this activity:
VeeAnn A. Cross
U.S. Geological Survey
Marine Geologist
Woods Hole Coastal and Marine Science Center
Woods Hole, MA
(508) 548-8700 x2251 (voice)
(508) 457-2310 (FAX)
vatnipp@usgs.gov
Data sources used in this process:
Data sources produced in this process:
- Date: Apr-2010 (process 2 of 26)
-
With the lines of interest extracted from the original HYPACK files, additional scripts were run to extract the specific information of interest. The next two scripts are doholdhypack and awkholdhypack. doholdhypack is a shell script, run under Cygwin, that cycles through all the *.navdep files in the folder and calls the awkholdhypack AWK script to process each file, with the results output to *.holdhypack. The AWK script holds each GPGGA position fix and writes it out when the next fix arrives, so that the depth from an intervening SDDBT record is attached to the preceding position. Of note, if the SDDBT record does not contain a depth value, a value of -9999 is written to the output file, acting as a nodata value.
doholdhypack:
files=`ls *.navdep | cut -d. -f1`
for file in $files
do
awk -f awkholdhypack $file.navdep > $file.holdhypack
done
The AWK script awkholdhypack:
BEGIN {
FS = ","
}
{
FS = ","
depth = -9999
if ($1 ~ /GPGGA/)
{
utctime = $2
latdeg = substr($3,1,2)
latmin = substr($3,3,6)
declat = latdeg + (latmin/60)
londeg = substr($5,1,3)
lonmin = substr($5,4,6)
declon = -1 * (londeg + (lonmin/60))
if (NR==1) {
holddepth = -9999
}
else {
printf("%s, %9.6f, %9.6f, %5.1f\n", holdutctime, holddeclon, holddeclat, holddepth)
}
holdutctime = utctime
holddeclon = declon
holddeclat = declat
holddepth = -9999
}
if ($1 ~ /SDDBT/)
{
if ($4 != "")
{
depthreal = $4
holddepth = depthreal
}
else
{
depthreal = -9999
holddepth = -9999
}
}
}
END {
printf("%s, %9.6f, %9.6f, %5.1f\n", holdutctime, holddeclon, holddeclat, holddepth)
}
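Continuing the schematic example above: when the next GPGGA record arrives, the held fix is printed together with the depth from the intervening SDDBT record, producing an output line of the form:
121300.00, -75.210900, 38.585383,   3.7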
Data sources used in this process:
Data sources produced in this process:
- Date: Apr-2010 (process 3 of 26)
-
With each individual HYPACK file processed, all the individual files from a given day were concatenated into a single file. Because the filenames begin with the line number and the lines were not surveyed in numerical order, the files must be concatenated in the order they were acquired. The start time is part of the filename, but a LOG file written during acquisition also records the order of the lines. By editing this LOG file and converting it to a shell script, the individual files can be concatenated in chronological order for each day. Running the shell script for each day under Cygwin concatenated the files. What follows is the example from Julian day 103:
cat 201_1213.holdhypack \
202_1414.holdhypack \
203_1441.holdhypack \
204_1600.holdhypack > jd103hypack.csv
Data sources used in this process:
Data sources produced in this process:
- jd103hypack.csv
- jd104hypack.csv
- jd105hypack.csv
- Date: Apr-2010 (process 4 of 26)
-
Using the vi editor under Cygwin, the resulting CSV file for each day was edited to add the header line "gpstime, longitude, latitude, depth_m". With this header line added, each CSV file can be imported to ArcMap 9.2 as an event theme using Tools - Add XY Data and defining the projection as Geographic, WGS84. Each event theme is then converted to a point shapefile by right-clicking the event theme - Data - Export Data and saving a shapefile for each HYPACK event theme.
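The same header edit can be scripted. As a sketch (the _hdr output filename here is hypothetical), an AWK one-liner under Cygwin prepends the header line:
awk 'BEGIN {print "gpstime, longitude, latitude, depth_m"} {print}' jd103hypack.csv > jd103hypack_hdr.csv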
Data sources used in this process:
- jd103hypack.csv
- jd104hypack.csv
- jd105hypack.csv
Data sources produced in this process:
- jd103hypack.shp
- jd104hypack.shp
- jd105hypack.shp
- Date: Apr-2010 (process 5 of 26)
-
The attribute table of each shapefile was edited in ArcMap 9.2 to add two additional attributes: gpsdate and jday. These attributes were populated with the appropriate information for each day of data collection.
Data sources used in this process:
- jd103hypack.shp
- jd104hypack.shp
- jd105hypack.shp
Data sources produced in this process:
- jd103hypack.shp
- jd104hypack.shp
- jd105hypack.shp
- Date: Apr-2010 (process 6 of 26)
-
Only the valid depth values are needed to create the bathymetry grid, so within ArcMap 9.2, under the shapefile properties, a definition query was applied to each shapefile: "depth_m" <> -9999. XTools Pro 5.2 was then used to export each shapefile to a text file. Fields exported: gpstime, longitude, latitude, depth_m, jday. Save as type text, ANSI encoding.
Data sources used in this process:
- jd103hypack.shp
- jd104hypack.shp
- jd105hypack.shp
Data sources produced in this process:
- jd103_exp.txt
- jd104_exp.txt
- jd105_exp.txt
- Date: Apr-2010 (process 7 of 26)
-
An AWK script was used to reformat each text file into the format needed for the MATLAB scripts that perform the tide corrections. The AWK script awkbathy_fmt:
BEGIN {
FS=","
}
{
FS=","
hr=substr($1,1,2)
min=substr($1,3,2)
sec=substr($1,5,2)
jday = $5
longitude = $2
latitude = $3
depth = $4
printf("%s\t%s\t%s\t%s\t%9.6f\t%9.6f\t%3.1f\n", jday, hr, min, sec, longitude, latitude, depth)
}
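A wrapper in the style of donav (a sketch; the actual invocation is not recorded in this metadata) runs awkbathy_fmt over each exported text file and produces the *_exp.dat names listed below:
files=`ls *_exp.txt | cut -d. -f1`
for file in $files
do
awk -f awkbathy_fmt $file.txt > $file.dat
done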
Data sources used in this process:
- jd103_exp.txt
- jd104_exp.txt
- jd105_exp.txt
Data sources produced in this process:
- jd103_exp.dat
- jd104_exp.dat
- jd105_exp.dat
- Date: Apr-2010 (process 8 of 26)
-
One day of equipment testing (JD102 - April 12, 2010) acquired resistivity data, but no HYPACK data. Additionally, in the western portion of the bay on JD105 (April 15), the HYPACK system was not recording, but navigation with bathymetry was being recorded by the CRP system during lines L21F1 and L22F1. Although the test line on JD102 was not fully processed, the GPS file from that day was processed in the same manner as the other GPS files from the CRP data collection. An example of how the GPS files are handled can be read in the metadata for jd103gps_bestdepth.shp, available from https://pubs.usgs.gov/of/2011/1039/html/ofr2011-1039-catalog.html. The processed shapefile from JD102, as well as selected records from JD105 (between GPS times 133421 and 140157), were exported using XTools Pro 5.2 with the fields gpstime, longitude, latitude, depth_m, and jday.
Data sources used in this process:
- jd102.shp
- jd105gps_spatjoin.shp
Data sources produced in this process:
- tstbathy.txt (JD102 resistivity portion)
- jd105gps_spatjoin_extradepths_exp.txt
- Date: Apr-2010 (process 9 of 26)
-
The resistivity-acquired bathymetry values were processed with the same AWK script (awkbathy_fmt, listed under process step 7) to generate files in the appropriate format for the MATLAB tide-processing scripts.
Data sources used in this process:
- tstbathy.txt (JD102 resistivity portion)
- jd105gps_spatjoin_extradepths_exp.txt
Data sources produced in this process:
- tstbathy.dat
- bathy.dat (JD105 formatted bathymetry)
- Date: Apr-2010 (process 10 of 26)
-
All of the HYPACK-formatted bathymetry data and the resistivity data from JD102 were concatenated together in Cygwin using the command: cat tstbathy.dat jd103_hyp.txt jd104_hyp.txt jd105_hyp.txt > bathy.dat. The data from the JD105 resistivity collection were handled separately, since that gap wasn't noticed until later.
Data sources used in this process:
- tstbathy.dat
- jd103_hyp.txt
- jd104_hyp.txt
- jd105_hyp.txt
Data sources produced in this process:
- Date: Apr-2010 (process 11 of 26)
-
In order to tide-correct the bathymetry, data from tide stations in the area were required. There are two tide stations in the Indian River Bay area: one at Rosedale Beach, DE, and the other at the Indian River Inlet. The Rosedale Beach tides are available from https://waterdata.usgs.gov/usa/nwis/uv?site_no=01484540 and the Indian River Inlet tides from https://waterdata.usgs.gov/de/nwis/uv/?site_no=01484683&. In both cases the tab-delimited format was downloaded for all the days covering the CRP data acquisition. For both tide stations it's important to note that the times recorded in the tide files are in Eastern Standard Time (EST), even though at the time of interest (April 12-15, 2010) local time was actually Eastern Daylight Time (EDT). EST is 5 hours behind UTC, so 5 hours must be added to convert the recorded times to UTC.
Other pertinent information from the website for the Rosedale Beach tide station:
LOCATION.--Lat 38deg35min29.5sec, long. 75deg12min44.7sec, Sussex County, DE, Hydrologic Unit 02060010, on left bank attached to a privately owned fishing pier, at Seals Point, 1.9 miles west of Oak Orchard. DRAINAGE AREA.--Not determined. PERIOD OF RECORD.--April 1991 to current year. GAGE.--Water-stage recorder. Datum of gage is 0.0 ft above National Geodetic Vertical Datum of 1929. REMARKS.--This is a tidal station. Discharge are not determined at this location. U.S. Geological Survey satellite collection platform at station.
Other pertinent information from the website for the Indian River Inlet tide station:
LOCATION.--Lat 38deg36min35.4sec, long. 75deg04min04.8sec, Sussex County, DE, Hydrologic Unit 02060010, 0.3 mi northwest of the Indian River Inlet, 0.2 mi west of State Highway 1, 4.9 mi north of Bethany Beach and at the Indian River Coast Guard station. DRAINAGE AREA.--Not determined. PERIOD OF RECORD.--June 1988 to June 1989, April 1991 to current year. GAGE.--Water-stage recorder. Datum of gage is 0.0 ft above National Geodetic Vertical Datum of 1929. Mean Low-Low Water: Gage Datum + 1.51 ft. REMARKS.--This is a tidal station. Discharge are not determined at this location. U.S. Geological Survey satellite collection platform at station.
Data sources produced in this process:
- rosedale_beach.txt
- indianriver_inlet.txt
- Date: Apr-2010 (process 12 of 26)
-
Using the vi editor under Cygwin, the header lines were removed from the text file of each tide station's data.
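As a sketch, assuming the standard NWIS tab-delimited layout (comment lines beginning with "#", then a column-name row and a column-format row), the same header removal can be scripted with an AWK one-liner:
awk '!/^#/ && ++n > 2' rosedale_beach.txt > rosedale_beach_noheader.txt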
Data sources used in this process:
- rosedale_beach.txt
- indianriver_inlet.txt
Data sources produced in this process:
- rosedale_beach_noheader.txt
- indianriver_inlet_noheader.txt
- Date: Apr-2010 (process 13 of 26)
-
The tide data need to be reformatted to work with the MATLAB processing scripts. To do this, the files were processed with the AWK script awktide_fmt:
{
year = substr($3,1,4)
month = substr($3,6,2)
day = substr($3,9,2)
hr = substr($4,1,2)
min = substr($4,4,2)
utchr = hr + 5
tideval = $6
tideval_meters = tideval * 0.3048
printf("1\t%s\t%s\t%02d\t%02d\t%02d\t%02d\t%02.4f\t%2.3f\n", $2, year, >month, day, utchr, min, tideval_meters, tideval)
}
The first element written to the output file is the number 1. The value of this number isn't important, but the MATLAB script does expect the first element to be a number. Another important detail of this AWK script is that 5 is added to the hours from the tide file to convert from EST to UTC, making the times compatible with the bathymetry data. Also, the original tide data were recorded in feet, but the work here is in meters, so that conversion (multiplying by 0.3048) is also handled in the AWK script.
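As a worked example, assume the standard NWIS column order (agency, site number, date, time, time zone, gage height in feet; AWK's default whitespace field splitting handles either tabs or spaces) and a hypothetical record:
USGS 01484683 2010-04-13 08:15 EST 2.31 A
awktide_fmt writes this out (tab-delimited) as:
1 01484683 2010 04 13 13 15 0.7041 2.310
The hour shifts from 08 EST to 13 UTC, and 2.31 ft converts to 0.7041 m.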
Data sources used in this process:
- rosedale_beach_noheader.txt
- indianriver_inlet_noheader.txt
Data sources produced in this process:
- rd_tides.dat
- ir_tides.dat
- Date: Apr-2010 (process 14 of 26)
-
MATLAB version 7.5.0.342 (R2007b) was used to process the tide station data and bathymetry. The commands issued in MATLAB rely on MATLAB M-files written by the USGS in Woods Hole, MA. The scripts used are julian.m, linterp.m, gregorian.m, s2hms.m, and hms2h.m. There are two separate bathymetry files (both called bathy.dat) in separate folders to be processed: one is the HYPACK data plus the CRP bathymetry data from Julian day 102; the other is the bathymetry extracted from Julian day 105 when the HYPACK system was not recording. This series of commands is therefore run twice, once in each folder. The following MATLAB commands were issued to process the data, including comments explaining some of the steps. The comment lines are prefaced with "%".
%Step 1. Outside of Matlab, format the ASCII tide gauge data.
%Load the tide data.
load ir_tides.dat
load rd_tides.dat
greg_ir=ir_tides(:,3:7); %year month day hour minute needs to be in columns 3,4,5,6,7
greg_ir(:,6)=zeros(length(greg_ir(:,1)),1); %adding secs bc the tide file doesn't have any
greg_rd=rd_tides(:,3:7);
greg_rd(:,6)=zeros(length(greg_rd(:,1)),1);
%then establish the julian timebase for the observed tide data
jd_ir=julian(greg_ir);
jd_rd=julian(greg_rd);
h_ir=ir_tides(:,8); %observed tide in meters (in this case above NGVD29)
h_rd=rd_tides(:,8);
%Step 2: outside of Matlab I reformatted my bathy data. The original data was exported
%from a shapefile using XTools and then needed to be reformatted for my needs here
load bathy.dat %there are 2 bathy.dat files in 2 separate folders
jd0=julian([2010 1 1 0 0 0]); %starting point for yearday
%setting the julian time base for the ship stuff
%this assumes column 1=jd, 2=hour, 3=min, 4=sec
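%subtract 1 below because Jan 1 is yearday 1 in bathy.dat and jd0 already corresponds to Jan 1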
jdtrack=jd0+bathy(:,1)-1+(bathy(:,2)+bathy(:,3)/60+bathy(:,4)/3600)/24;
x=bathy(:,5); %longitude in decimal degrees in column 5
y=bathy(:,6); %latitude in decimal degrees in column 6
htrack=bathy(:,7);
%Step 3: interpolate observed tide data onto the ship track timebase using a linear
%interpolation. Splining the tide data could result in overshoots
tidetrack_ir=linterp(jd_ir,h_ir,jdtrack);
tidetrack_rd=linterp(jd_rd,h_rd,jdtrack);
h_corrected_ir=htrack-tidetrack_ir;
h_corrected_rd=htrack-tidetrack_rd;
%Step 4: output the information I want into a text file. I decided I basically want my
%original bathymetry data, plus the tide value and the corrected value
%the transpose (') flips the variables into rows; fprintf consumes the matrix column by column, writing one record per point
alltidestuff=[bathy'; tidetrack_ir'; h_corrected_ir';tidetrack_rd';h_corrected_rd'];
outfile=['veetest'];
fid=fopen(outfile,'w');
fprintf(fid,'%03d:%02d:%02d:%02d, %9.6f, %9.6f, %3.1f, %3.1f, %3.1f, %3.1f, %3.1f \n',alltidestuff);
fclose(fid);
The result is a comma-delimited text file with 8 columns of information: jdtime, longitude, latitude, depth_m, ir_tide, ir_cor, rd_tide, rd_cor
The definition of this header information is as follows:
jdtime: julian day and time from the GPS
longitude: longitude from the GPS
latitude: latitude from the GPS
depth_m: depth in meters of the fathometer (as recorded by the GPS)
ir_tide: interpolated tide values at the GPS time from the Indian River Inlet tide station
ir_cor: the corrected depth in meters of the fathometer based on the Indian River tide value (depth_m - ir_tide)
rd_tide: interpolated tide values at the GPS time from the Rosedale Beach tide station
rd_cor: the corrected depth in meters of the fathometer based on the Rosedale tide value (depth_m - rd_tide)
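Schematically, with made-up values, one veetest record looks like:
103:12:13:45, -75.210900, 38.585383, 3.7, 0.5, 3.2, 0.6, 3.1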
The header line was added to each veetest file, and the files were saved under new names: hypbathytides.csv and jd105_moredepths.csv.
Data sources used in this process:
Data sources produced in this process:
- hypbathytides.csv
- jd105_moredepths.csv
- Date: Apr-2010 (process 15 of 26)
-
Each of the resulting CSV files was added to ArcMap 9.2 as an event theme: Tools - Add XY Data. The projection was defined as Geographic, WGS84. These event themes were then converted to shapefiles: right-click the event theme - Data - Export Data.
Data sources used in this process:
- hypbathytides.csv
- jd105_moredepths.csv
Data sources produced in this process:
- hypbathytides.shp
- jd105_moredepths.shp
- Date: Apr-2010 (process 16 of 26)
-
The resulting shapefiles are in geographic coordinates. In order to apply a distance-weighted tide adjustment to each point, the point shapefiles were projected to UTM Zone 18 so that distance values are in meters, using ArcMap 9.2 - ArcToolbox - Data Management Tools - Projections and Transformations - Feature - Project. Input each geographic shapefile and output a new shapefile in a UTM Zone 18, WGS84 projection; no datum transformation is necessary.
Data sources used in this process:
- hypbathytides.shp
- jd105_moredepths.shp
Data sources produced in this process:
- hypbathytides_utm18.shp
- jd105_moredepths_utm18.shp
- Date: Apr-2010 (process 17 of 26)
-
The tide station locations are known, so the latitude/longitude information was entered into a comma-delimited text file, added to ArcMap 9.2 as an event theme, and converted to a shapefile. Within ArcMap 9.2, the Geographic, WGS84 shapefile was projected to a UTM Zone 18, WGS84 shapefile.
Data sources used in this process:
Data sources produced in this process:
- tidestations.shp
- tidestations_utm18.shp
- Date: Apr-2010 (process 18 of 26)
-
The UTM tide station shapefile and the two shapefiles with the bathymetry and tide information were added to ArcMap 9.2. One tide station (Indian River Inlet) is at the eastern end of the study area, while the other (Rosedale Beach) is at the western end. Instead of using the adjustment from just one tide station, a weighted tide adjustment was applied based on the distance to each station. To get the distance to each tide station into the bathymetry shapefiles, the Near tool was used (ArcToolbox - Analysis Tools - Proximity - Near). Prior to using the tool, the following attributes (type double) were added to the shapefile: dist_inlet, dist_rose, wt_tide, wttide_dep. The Near tool only reports the distance to the closest feature when two are available, so to get the distance to each station separately, a definition query was applied to the tide station shapefile so that only one station was present at a time, and the Near tool was run twice. With one tide station visible, run the Near tool with these parameters: input - hypbathytides_utm18; near features - tidestations_utm18; leave everything else at default (no search radius). Then copy the "near_dist" attribute (automatically created by the Near tool) into the distance attribute created for that station. Repeat the same process for the other tide station. Repeat the same process for jd105_moredepths_utm18.shp (add the attributes and run the Near tool with the tide stations).
Data sources used in this process:
- hypbathytides_utm18.shp
- jd105_moredepths_utm18.shp
- tidestations_utm18.shp
Data sources produced in this process:
- hypbathytides_utm18.shp
- jd105_moredepths_utm18.shp
- Date: Apr-2010 (process 19 of 26)
-
In ArcMap 9.2, after using the Near tool and copying the distances to the tide stations into the appropriate fields, use the Field Calculator on the "wt_tide" attribute:
wt_tide = ((1-([dist_rose]/( [dist_rose] + [dist_inlet])))* [rd_tide]) + ((1-( [dist_inlet]/( [dist_rose] + [dist_inlet])))* [ir_tide])
This gives a weighted tide value for each point based on its distance to each tide station, with the nearer station receiving the larger weight. Now apply the tide correction by calculating wttide_dep = [depth_m] - [wt_tide].
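As a quick check of the formula with made-up numbers (a point 1000 m from Rosedale Beach and 3000 m from the Indian River Inlet, with station tides of 0.60 m and 0.20 m), the same arithmetic can be run under Cygwin:
awk 'BEGIN {
dist_rose = 1000; dist_inlet = 3000   # hypothetical distances in meters
rd_tide = 0.60; ir_tide = 0.20        # hypothetical tide values in meters
sum = dist_rose + dist_inlet
wt_tide = (1 - dist_rose/sum) * rd_tide + (1 - dist_inlet/sum) * ir_tide
print wt_tide   # prints 0.5; the nearer station (Rosedale) dominates
}'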
Data sources used in this process:
- hypbathytides_utm18.shp
- jd105_moredepths_utm18.shp
Data sources produced in this process:
- hypbathytides_utm18.shp
- jd105_moredepths_utm18.shp
- Date: Apr-2010 (process 20 of 26)
-
Use the Topo to Raster tool to generate a new grid: ArcMap 9.2 - ArcToolbox - Spatial Analyst Tools - Interpolation - Topo to Raster.
Inputs (feature; field; type):
bounding_poly; ; boundary
jd105_moredepths_utm18; wttide_dep; PointElevation
hypbathytides_utm18; wttide_dep; PointElevation
output: bestbathy
cell size: 25
drainage enforcement: no_enforce
primary type of input data: spot
Everything else left to defaults.
The bounding polygon was a quick polygon created around the survey area to limit the extent of extrapolation in areas where no data were available.
Data sources used in this process:
- jd105_moredepths_utm18
- hypbathytides_utm18
Data sources produced in this process:
- Date: Apr-2010 (process 21 of 26)
-
Then filter the resulting grid:
ArcToolbox - Spatial Analyst Tools - Neighborhood - Filter
Input: bestbathy
Output: bestbat_fil
Filter type: low
Check the box next to ignore NODATA in calculations.
Data sources used in this process:
Data sources produced in this process:
- Date: Sep-2010 (process 22 of 26)
-
Used the Spatial Analyst toolbar in ArcMap 9.2 - Raster Calculator and multiplied the grid bestbat_fil by -1. Then right-clicked the calculation grid - Data - Make Permanent with the output: bbat_filneg
Data sources used in this process:
Data sources produced in this process:
- Date: Dec-2010 (process 23 of 26)
-
The bathymetry grid was renamed to irb_bathy in ArcCatalog 9.2.
Data sources used in this process:
Data sources produced in this process:
- Date: 02-Oct-2017 (process 24 of 26)
-
Edits to the metadata were made to fix any errors that MP v 2.9.36 flagged. This is necessary to enable the metadata to be successfully harvested for various data catalogs. In some cases, this meant adding text "Information unavailable" or "Information unavailable from original metadata" for those required fields that were left blank. Other minor edits were probably performed (title, publisher, publication place, etc.). Attempted to modify http to https where appropriate. Moved the minimal source information provided to make it the first process step. The metadata date (but not the metadata creator) was edited to reflect the date of these changes. The metadata available from a harvester may supersede metadata bundled within a download file. Compare the metadata dates to determine which metadata file is most recent.
Person who carried out this activity:
U.S. Geological Survey
Attn: VeeAnn A. Cross
Marine Geologist
384 Woods Hole Road
Woods Hole, MA
508-548-8700 x2251 (voice)
508-457-2310 (FAX)
vatnipp@usgs.gov
- Date: 20-Jul-2018 (process 25 of 26)
-
USGS Thesaurus keywords added to the keyword section.
Person who carried out this activity:
U.S. Geological Survey
Attn: VeeAnn A. Cross
Marine Geologist
384 Woods Hole Road
Woods Hole, MA
508-548-8700 x2251 (voice)
508-457-2310 (FAX)
vatnipp@usgs.gov
- Date: 08-Sep-2020 (process 26 of 26)
-
Added keywords section with USGS persistent identifier as theme keyword.
Person who carried out this activity:
U.S. Geological Survey
Attn: VeeAnn A. Cross
Marine Geologist
384 Woods Hole Road
Woods Hole, MA
508-548-8700 x2251 (voice)
508-457-2310 (FAX)
vatnipp@usgs.gov