Esri Binary grid of the bathymetry of Indian River Bay, Delaware, generated from fathometer data acquired in April 2010 during U.S. Geological Survey Field Activity 2010-006-FA (IRB_BATHY, UTM, Zone 18, WGS 84)


Frequently anticipated questions:


What does this data set describe?

Title:
Esri Binary grid of the bathymetry of Indian River Bay, Delaware, generated from fathometer data acquired in April 2010 during U.S. Geological Survey Field Activity 2010-006-FA (IRB_BATHY, UTM, Zone 18, WGS 84)
Abstract:
A geophysical survey to delineate the fresh-saline groundwater interface and associated sub-bottom sedimentary structures beneath Indian River Bay, Delaware, was carried out in April 2010. This included surveying at higher spatial resolution in the vicinity of a study site at Holts Landing, where intensive onshore and offshore studies were subsequently completed. The total length of continuous resistivity profiling (CRP) survey lines was 145 kilometers (km), with 36 km of chirp seismic lines surveyed around the perimeter of the bay. Medium-resolution CRP surveying was performed using a 50-meter streamer in a bay-wide grid. Results of the surveying and data inversion showed the presence of many buried paleochannels beneath Indian River Bay that generally extended perpendicular from the shoreline in areas of modern tributaries, tidal creeks, and marshes. An especially wide and deep paleochannel system was imaged in the southeastern part of the bay near White Creek. Many paleochannels also had high-resistivity anomalies corresponding to low-salinity groundwater plumes associated with them, likely due to the presence of fine-grained estuarine mud and peats in the channel fills that act as submarine confining units. Where present, these units allow plumes of low-salinity groundwater that was recharged onshore to move beyond the shoreline, creating a complex fresh-saline groundwater interface in the subsurface. The properties of this interface are important considerations in construction of accurate coastal groundwater flow models. These models are required to help predict how nutrient-rich groundwater, recharged in agricultural watersheds such as this one, makes its way into coastal bays and impacts surface water quality and estuarine ecosystems. For more information on the survey conducted for this project, see https://cmgds.marine.usgs.gov/fan_info.php?fan=2010-006-FA.
  1. How might this data set be cited?
    Cross, VeeAnn A., 2014, Esri Binary grid of the bathymetry of Indian River Bay, Delaware, generated from fathometer data acquired in April 2010 during U.S. Geological Survey Field Activity 2010-006-FA (IRB_BATHY, UTM, Zone 18, WGS 84): Open-File Report 2011-1039, U.S. Geological Survey, Coastal and Marine Geology Program, Woods Hole Coastal and Marine Science Center, Woods Hole, MA.

    Online Links:

    This is part of the following larger work.

    Cross, V.A., Bratton, J.F., Michael, H.A., Kroeger, K.D., Green, A., and Bergeron, E., 2014, Continuous Resistivity Profiling and Seismic-Reflection Data Collected in April 2010 from Indian River Bay, Delaware: Open-File Report 2011-1039, U.S. Geological Survey, Reston, VA.

    Online Links:

  2. What geographic area does the data set cover?
    West_Bounding_Coordinate: -75.207119
    East_Bounding_Coordinate: -75.060608
    North_Bounding_Coordinate: 38.621180
    South_Bounding_Coordinate: 38.565589
  3. What does it look like?
    https://pubs.usgs.gov/of/2011/1039/data/basemap/bathy/irb_bathygrd.gif (GIF)
    Thumbnail GIF image of bathymetry in Indian River Bay. The coastline is included for spatial reference.
  4. Does the data set describe conditions during a particular time period?
    Beginning_Date: 12-Apr-2010
    Ending_Date: 15-Apr-2010
    Currentness_Reference:
    ground condition
  5. What is the general form of this data set?
    Geospatial_Data_Presentation_Form: raster digital data
  6. How does the data set represent geographic features?
    1. How are geographic features stored in the data set?
      This is a Raster data set. It contains the following raster data types:
      • Dimensions 246 x 510 x 1, type Grid Cell
    2. What coordinate system is used to represent geographic features?
      Grid_Coordinate_System_Name: Universal Transverse Mercator
      Universal_Transverse_Mercator:
      UTM_Zone_Number: 18
      Transverse_Mercator:
      Scale_Factor_at_Central_Meridian: 0.999600
      Longitude_of_Central_Meridian: -75.000000
      Latitude_of_Projection_Origin: 0.000000
      False_Easting: 500000.000000
      False_Northing: 0.000000
      Planar coordinates are encoded using row and column
      Abscissae (x-coordinates) are specified to the nearest 25.000000
      Ordinates (y-coordinates) are specified to the nearest 25.000000
      Planar coordinates are specified in meters
      The horizontal datum used is D_WGS_1984.
      The ellipsoid used is WGS_1984.
      The semi-major axis of the ellipsoid used is 6378137.000000.
      The flattening of the ellipsoid used is 1/298.257224.
      Vertical_Coordinate_System_Definition:
      Altitude_System_Definition:
      Altitude_Datum_Name: National Geodetic Vertical Datum of 1929
      Altitude_Resolution: 0.1
      Altitude_Distance_Units: meters
      Altitude_Encoding_Method: Implicit coordinate
  7. How does the data set describe geographic features?
    Entity_and_Attribute_Overview:
    This grid is in meters in the vertical datum of NGVD29. Based on the NOAA VERTCON website (http://www.ngs.noaa.gov/cgi-bin/VERTCON/vert_con.prl, accessed October 2010), the conversion from NGVD29 to NAVD88 is -0.238 meters at both the Indian River Inlet and Rosedale Beach tide stations. Because the offset is the same at both stations, a single offset applied to the whole grid converts it to NAVD88: subtracting 0.238 meters from this grid yields a grid referenced to NAVD88.
    Entity_and_Attribute_Detail_Citation: U.S. Geological Survey
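
    The NGVD29-to-NAVD88 offset described in the overview can be sketched as follows (a hedged illustration; the function name, sample values, and nodata handling are assumptions, and the actual grid is an Esri binary grid read with GIS software):

```python
import numpy as np

# VERTCON offset at both local tide stations: NAVD88 = NGVD29 - 0.238 m.
NGVD29_TO_NAVD88_M = -0.238

def ngvd29_to_navd88(values_m, nodata=-9999.0):
    """Apply the uniform datum shift, leaving nodata cells untouched."""
    values = np.asarray(values_m, dtype=float)
    return np.where(values == nodata, values, values + NGVD29_TO_NAVD88_M)

# A 2.0 m NGVD29 value becomes 2.0 - 0.238 = 1.762 m NAVD88.
shifted = ngvd29_to_navd88([2.0, -9999.0])
```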

Who produced the data set?

  1. Who are the originators of the data set? (may include formal authors, digital compilers, and editors)
    • VeeAnn A. Cross
  2. Who also contributed to the data set?
  3. To whom should users address questions about the data?
    VeeAnn A. Cross
    U.S. Geological Survey
    Marine Geologist
    Woods Hole Coastal and Marine Science Center
    Woods Hole, MA

    (508) 548-8700 x2251 (voice)
    (508) 457-2310 (FAX)
    vatnipp@usgs.gov

Why was the data set created?

The purpose of this dataset is to provide a tide-corrected bathymetric grid of Indian River Bay.

How was the data set created?

  1. From what previous works were the data drawn?
  2. How were the data generated, processed, and modified?
    Date: Apr-2010 (process 1 of 26)
    All the ASCII HYPACK navigation files collected were placed in their own folder on the computer based on day of collection. Each file follows the filename convention LLL_TTTT.RAW, where LLL is the line number and TTTT is the start time (UTC) of data collection in the format HHMM. To parse these files and extract navigation and depth information, two scripts were run under Cygwin. The first, "donav", is an executable shell script that cycles through all the *.RAW files in the folder and calls an AWK script to extract particular lines of information.
    donav:
    files=`ls *.RAW | cut -d. -f1`
    for file in $files
    do
    	awk -f awkit $file.RAW > $file.navdep
    done
    
    The AWK script "awkit" simply extracts any line of information in the HYPACK file that contains either the string "GPGGA" or "SDDBT" and writes this information to a new file with the extension navdep.
    awkit:
    {
    if ($0 ~ /GPGGA|SDDBT/) {
    	print $0
    	}
    }
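
    As a hedged illustration of the LLL_TTTT.RAW naming convention described above (the helper name and sample file are hypothetical):

```python
def parse_hypack_name(filename):
    """Split an LLL_TTTT.RAW HYPACK name into line number and UTC start time."""
    stem = filename.rsplit(".", 1)[0]        # drop the .RAW extension
    line, start = stem.split("_")            # LLL and TTTT (HHMM, UTC)
    return {"line": line, "utc_start": start[:2] + ":" + start[2:]}

# "201_1213.RAW" is line 201, with logging started at 12:13 UTC.
info = parse_hypack_name("201_1213.RAW")
```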
    
    This process step and all subsequent process steps were performed by the same person - VeeAnn A. Cross. Person who carried out this activity:
    VeeAnn A. Cross
    U.S. Geological Survey
    Marine Geologist
    Woods Hole Coastal and Marine Science Center
    Woods Hole, MA

    (508) 548-8700 x2251 (voice)
    (508) 457-2310 (FAX)
    vatnipp@usgs.gov
    Data sources used in this process:
    • *.RAW
    Data sources produced in this process:
    • *.navdep
    Date: Apr-2010 (process 2 of 26)
    With the lines of interest extracted from the original HYPACK files, additional scripts were run to extract the specific information of interest. The next two scripts are doholdhypack and awkholdhypack. doholdhypack is a shell script run under Cygwin that cycles through all the *.navdep files in the folder and calls the awkholdhypack AWK script to process each file, with the results output to *.holdhypack. Of note, if an SDDBT record does not contain a depth value, a value of -9999 is written to the output file, acting as a nodata value.
    doholdhypack:
    files=`ls *.navdep | cut -d. -f1`
    for file in $files
    do
    	awk -f awkholdhypack $file.navdep > $file.holdhypack
    done
    
    The AWK script awkholdhypack:
    BEGIN {
    FS = ","
    }
    {
    FS = ","
    depth = -9999
    if ($1 ~ /GPGGA/)
    	{
    	utctime = $2
    	latdeg = substr($3,1,2)
    	latmin = substr($3,3,6)
    	declat = latdeg + (latmin/60)
    	londeg = substr($5,1,3)
    	lonmin = substr($5,4,6)
    	declon = -1 * (londeg + (lonmin/60))
    	if (NR==1) {
    		holddepth = -9999
    		}
    	else {
    		printf("%s, %9.6f, %9.6f, %5.1f\n", holdutctime, holddeclon, holddeclat, holddepth)
    	}
    	holdutctime = utctime
    	holdutcdate = utcdate
    	holddeclon = declon
    	holddeclat = declat
    	holddepth = -9999
    	}
    if ($1 ~ /SDDBT/)
    	{
    	if ($4 != "")
    		{
    		depthreal = $4
    		holddepth = depthreal
    		}
    	else
    		{
    		depthreal = -9999
    		holddepth = -9999
    		}
    	}
    }
    END {
    printf("%s, %9.6f, %9.6f, %5.1f\n", holdutctime, holddeclon, holddeclat, holddepth)
    }
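
    The coordinate handling in awkholdhypack converts NMEA-style ddmm.mmmm (latitude) and dddmm.mmmm (longitude) fields to signed decimal degrees. A minimal Python sketch of the same arithmetic (the function name is illustrative):

```python
def nmea_to_decimal(value, deg_digits, negate=False):
    """Convert an NMEA coordinate string (degrees + decimal minutes) to decimal degrees."""
    degrees = int(value[:deg_digits])
    minutes = float(value[deg_digits:])
    decimal = degrees + minutes / 60.0
    return -decimal if negate else decimal   # negate for west longitudes

# Latitude "3836.1234" N -> 38 + 36.1234/60 degrees.
lat = nmea_to_decimal("3836.1234", 2)
# Longitude "07512.3456" W is negated, matching the -1 factor in the AWK script.
lon = nmea_to_decimal("07512.3456", 3, negate=True)
```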
    
    Data sources used in this process:
    • *.navdep
    Data sources produced in this process:
    • *.holdhypack
    Date: Apr-2010 (process 3 of 26)
    With each individual HYPACK file processed, all the individual files for a given day were concatenated into a single file. Because the file naming convention starts with the line number, and surveying wasn't done in numerical line order, the files must be appended in the order they were acquired. The start time is part of each filename, but a LOG file written during acquisition also records the order of the lines. By editing this LOG file and converting it to a shell script, the individual files can be concatenated in chronological order for each day. Running the shell script for each day under Cygwin concatenated the files. What follows is the example from Julian day 103:
    cat 201_1213.holdhypack \
    202_1414.holdhypack \
    203_1441.holdhypack \
    204_1600.holdhypack > jd103hypack.csv
    
    Data sources used in this process:
    • *.holdhypack
    Data sources produced in this process:
    • jd103hypack.csv
    • jd104hypack.csv
    • jd105hypack.csv
    Date: Apr-2010 (process 4 of 26)
    Using the VI editor under Cygwin, the resulting CSV file for each day was edited to add the header line "gpstime, longitude, latitude, depth_m". With this header line added, each CSV file was imported to ArcMap 9.2 as an event theme using Tools - Add XY Data, with the projection defined as Geographic, WGS84. Each event theme was then converted to a point shapefile (right mouse click - Data - Export), saving a shapefile for each HYPACK event theme. Data sources used in this process:
    • jd103hypack.csv
    • jd104hypack.csv
    • jd105hypack.csv
    Data sources produced in this process:
    • jd103hypack.shp
    • jd104hypack.shp
    • jd105hypack.shp
    Date: Apr-2010 (process 5 of 26)
    The table of this shapefile was edited in ArcMap 9.2 to add two additional attributes: gpsdate and jday. These attributes were populated with the appropriate information for each day of data collection. Data sources used in this process:
    • jd103hypack.shp
    • jd104hypack.shp
    • jd105hypack.shp
    Data sources produced in this process:
    • jd103hypack.shp
    • jd104hypack.shp
    • jd105hypack.shp
    Date: Apr-2010 (process 6 of 26)
    Only the valid depth values are needed to create the bathymetry grid, so within ArcMap 9.2 under the shapefile properties, apply a definition query on the shapefile: "depth_m" <> -9999. Then use XTools Pro 5.2 to export the shapefile to a text file. Fields for export: gpstime, longitude, latitude, depth, jday. Save as type text, ANSI encoding. Data sources used in this process:
    • jd103hypack.shp
    • jd104hypack.shp
    • jd105hypack.shp
    Data sources produced in this process:
    • jd103_exp.txt
    • jd104_exp.txt
    • jd105_exp.txt
    Date: Apr-2010 (process 7 of 26)
    Use an AWK script to reformat each text file into the format needed for the MATLAB scripts that will be used for tide corrections. The AWK script awkbathy_fmt:
    BEGIN {
    FS=","
    }
    {
    FS=","
    hr=substr($1,1,2)
    min=substr($1,3,2)
    sec=substr($1,5,2)
    jday = $5
    longitude = $2
    latitude = $3
    depth = $4
    printf("%s\t%s\t%s\t%s\t%9.6f\t%9.6f\t%3.1f\n", jday, hr, min, sec, longitude, latitude, depth)
    }
    
    Data sources used in this process:
    • jd103_exp.txt
    • jd104_exp.txt
    • jd105_exp.txt
    Data sources produced in this process:
    • jd103_exp.dat
    • jd104_exp.dat
    • jd105_exp.dat
    Date: Apr-2010 (process 8 of 26)
    One day of equipment testing (JD102, April 12, 2010) acquired resistivity data but no HYPACK data. Additionally, in the western portion of the bay on JD105 (April 15), the HYPACK system was not recording, but navigation with bathymetry was being recorded by the CRP system during lines L21F1 and L22F1. Although the test line on JD102 was not fully processed, the GPS file from that day was processed in the same manner as the other GPS files from the CRP data collection. An example of how the GPS files are handled can be read in the metadata for jd103gps_bestdepth.shp, available from https://pubs.usgs.gov/of/2011/1039/html/ofr2011-1039-catalog.html. The processed shapefile from JD102, as well as selected records from JD105 (between GPS times 133421 and 140157), were exported using XTools Pro v 5.2 with the fields gpstime, longitude, latitude, depth_m, and jday. Data sources used in this process:
    • jd102.shp
    • jd105gps_spatjoin.shp
    Data sources produced in this process:
    • tstbathy.txt (JD102 resistivity portion)
    • jd105gps_spatjoin_extradepths_exp.txt
    Date: Apr-2010 (process 9 of 26)
    The resistivity-acquired bathymetry values were processed with the same AWK script (awkbathy_fmt) to generate files in the format required by the MATLAB tide-processing scripts.
    awkbathy_fmt
    BEGIN {
    FS=","
    }
    {
    FS=","
    hr=substr($1,1,2)
    min=substr($1,3,2)
    sec=substr($1,5,2)
    jday = $5
    longitude = $2
    latitude = $3
    depth = $4
    printf("%s\t%s\t%s\t%s\t%9.6f\t%9.6f\t%3.1f\n", jday, hr, min, sec, longitude, latitude, depth)
    }
    
    Data sources used in this process:
    • tstbathy.txt (JD102 resistivity portion)
    • jd105gps_spatjoin_extradepths_exp.txt
    Data sources produced in this process:
    • tstbathy.dat
    • bathy.dat (JD105 formatted bathymetry)
    Date: Apr-2010 (process 10 of 26)
    All of the HYPACK formatted bathymetry data and the resistivity data from JD102 were concatenated together in Cygwin using the command: cat tstbathy.dat jd103_hyp.txt jd104_hyp.txt jd105_hyp.txt > bathy.dat. The data from JD105 resistivity collection was handled separately since that gap wasn't noticed until later. Data sources used in this process:
    • tstbathy.dat
    • jd103_hyp.txt
    • jd104_hyp.txt
    • jd105_hyp.txt
    Data sources produced in this process:
    • bathy.dat
    Date: Apr-2010 (process 11 of 26)
    To tide-correct the bathymetry, data from tide stations in the area were required. There are two tide stations in the Indian River Bay area: one at Rosedale Beach, DE, and the other at the Indian River Inlet. The Rosedale Beach tides are available from https://waterdata.usgs.gov/usa/nwis/uv?site_no=01484540 and the Indian River Inlet tides are available from https://waterdata.usgs.gov/de/nwis/uv/?site_no=01484683&. In both cases the tab-delimited format was downloaded for all the days covering the CRP data acquisition. For both tide stations it is important to note that the times recorded in the tide files are in Eastern Standard Time (EST), even though at the time of interest (April 12-15, 2010) local time was actually Eastern Daylight Time (EDT). EST is 5 hours behind UTC, so 5 hours must be added to EST times to convert them to UTC. Other pertinent information from the website for the Rosedale Beach tide station: LOCATION.--Lat 38deg35min29.5sec, long. 75deg12min44.7sec, Sussex County, DE, Hydrologic Unit 02060010, on left bank attached to a privately owned fishing pier, at Seals Point, 1.9 miles west of Oak Orchard. DRAINAGE AREA.--Not determined. PERIOD OF RECORD.--April 1991 to current year. GAGE.--Water-stage recorder. Datum of gage is 0.0 ft above National Geodetic Vertical Datum of 1929. REMARKS.--This is a tidal station. Discharge are not determined at this location. U.S. Geological Survey satellite collection platform at station.
    Other pertinent information from the website for the Indian River Inlet tide station: LOCATION.--Lat 38deg36min35.4sec, long. 75deg04min04.8sec, Sussex County, DE, Hydrologic Unit 02060010, 0.3 mi northwest of the Indian River Inlet, 0.2 mi west of State Highway 1, 4.9 mi north of Bethany Beach and at the Indian River Coast Guard station. DRAINAGE AREA.--Not determined. PERIOD OF RECORD.--June 1988 to June 1989, April 1991 to current year. GAGE.--Water-stage recorder. Datum of gage is 0.0 ft above National Geodetic Vertical Datum of 1929. Mean Low-Low Water: Gage Datum + 1.51 ft. REMARKS.--This is a tidal station. Discharge are not determined at this location. U.S. Geological Survey satellite collection platform at station. Data sources produced in this process:
    • rosedale_beach.txt
    • indianriver_inlet.txt
    Date: Apr-2010 (process 12 of 26)
    Using the VI editor under Cygwin, the header lines were removed from the tide station text files. Data sources used in this process:
    • rosedale_beach.txt
    • indianriver_inlet.txt
    Data sources produced in this process:
    • rosedale_beach_noheader.txt
    • indianriver_inlet_noheader.txt
    Date: Apr-2010 (process 13 of 26)
    The tide data needs to be reformatted to work with the MATLAB processing scripts. In order to do this, process the files through an AWK script awktide_fmt:
    {
    year = substr($3,1,4)
    month = substr($3,6,2)
    day = substr($3,9,2)
    hr = substr($4,1,2)
    min = substr($4,4,2)
    utchr = hr + 5
    tideval = $6
    tideval_meters = tideval * 0.3048
    printf("1\t%s\t%s\t%02d\t%02d\t%02d\t%02d\t%02.4f\t%2.3f\n", $2, year, month, day, utchr, min, tideval_meters, tideval)
    }
    
    The first element written to the output file is the number 1. The value of this number isn't important, but the MATLAB script does expect the first element to be a number. Another important piece of information about this AWK script is that 5 is added to the hours in the tide file to convert from EST to UTC to be compatible with the bathymetry data. Also, the original tide data was recorded in feet, but the work I'm doing is in meters. So that calculation is also taken care of in the AWK script. Data sources used in this process:
    • rosedale_beach_noheader.txt
    • indianriver_inlet_noheader.txt
    Data sources produced in this process:
    • rd_tides.dat
    • ir_tides.dat
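
    A hedged Python sketch of the same reformatting (field layout follows the NWIS tab-delimited records described above; the names are illustrative, and like the AWK script it does not roll hours past midnight):

```python
FT_TO_M = 0.3048          # tide stage is recorded in feet
EST_TO_UTC_HOURS = 5      # tide files are in EST; add 5 hours for UTC

def reformat_tide_record(date_str, time_str, tide_ft):
    """Convert one record (EST, feet) to the layout the MATLAB scripts expect (UTC, meters)."""
    year, month, day = int(date_str[:4]), int(date_str[5:7]), int(date_str[8:10])
    hour = int(time_str[:2]) + EST_TO_UTC_HOURS   # can exceed 23, as in the AWK
    minute = int(time_str[3:5])
    tide_m = round(float(tide_ft) * FT_TO_M, 4)
    return (year, month, day, hour, minute, tide_m)

# 10:30 EST on 2010-04-13 at a 1.5 ft stage -> 15:30 UTC, 0.4572 m.
record = reformat_tide_record("2010-04-13", "10:30", "1.5")
```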
    Date: Apr-2010 (process 14 of 26)
    MATLAB version 7.5.0.342 (R2007b) was used to process the tide station and bathymetry data. The commands issued in MATLAB rely on m-files written by the USGS in Woods Hole, MA: julian.m, linterp.m, gregorian.m, s2hms.m, and hms2h.m. There are two separate bathymetry files (both called bathy.dat) in separate folders to be processed: one is the HYPACK data plus the CRP bathymetry data from Julian day 102, and the other is the bathymetry extracted from Julian day 105 when the HYPACK system was not recording. This series of commands is therefore run twice, once in each folder. The following MATLAB commands were issued to process the data, including comments explaining some of the steps. Comment lines are prefaced with "%".
    %Step 1. Outside of Matlab, format the ASCII tide gauge data.
    %Load the tide data.
    load ir_tides.dat
    load rd_tides.dat
    greg_ir=ir_tides(:,3:7);  %year month day hour minute needs to be in columns 3,4,5,6,7
    greg_ir(:,6)=zeros(length(greg_ir(:,1)),1);  %adding secs bc the tide file doesn't have any
    greg_rd=rd_tides(:,3:7);
    greg_rd(:,6)=zeros(length(greg_rd(:,1)),1);
    %then establish the julian timebase for the observed tide data
    jd_ir=julian(greg_ir);
    jd_rd=julian(greg_rd);
    h_ir=ir_tides(:,8);  %observed tide in meters (in this case above NGVD29)
    h_rd=rd_tides(:,8);
    %Step 2: outside of Matlab I reformatted my bathy data.  The original data was exported
    %from a shapefile using XTools and then needed to be reformatted for my needs here
    load bathy.dat %there are 2 bathy.dat files in 2 separate folders
    jd0=julian([2010 1 1 0 0 0]);  %starting point for yearday
    %setting the julian time base for the ship stuff
    %this assumes column 1=jd, 2=hour, 3=min, 4=sec
    jdtrack=jd0+bathy(:,1)-1+(bathy(:,2)+bathy(:,3)/60+bathy(:,4)/3600)/24;
    x=bathy(:,5);  %longitude in decimal degrees in column 5
    y=bathy(:,6);  %latitude in decimal degrees in column 6
    htrack=bathy(:,7);
    %Step 3: interpolate observed tide data onto the ship track timebase using a linear
    %interpolation. Splining the tide data could result in overshoots
    tidetrack_ir=linterp(jd_ir,h_ir,jdtrack);
    tidetrack_rd=linterp(jd_rd,h_rd,jdtrack);
    h_corrected_ir=htrack-tidetrack_ir;
    h_corrected_rd=htrack-tidetrack_rd;
    %Step 4: output the information I want into a text file.  I decided I basically want my
    %original bathymetry data, plus the tide value and the corrected value
    %I have to use the ' to get the columns flipped so the append thing works
    alltidestuff=[bathy'; tidetrack_ir'; h_corrected_ir';tidetrack_rd';h_corrected_rd'];
    outfile=['veetest'];
    fid=fopen(outfile,'w');
    fprintf(fid,'%03d:%02d:%02d:%02d, %9.6f, %9.6f, %3.1f, %3.1f, %3.1f, %3.1f, %3.1f \n',alltidestuff);
    fclose(fid);
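
    Step 3 above is a plain linear interpolation of the observed tide onto the ship-track timebase. An equivalent hedged sketch in Python, with numpy.interp playing the role of linterp (the sample times and heights are made up):

```python
import numpy as np

# Observed tide: gauge times (decimal days) and heights (m above NGVD29) -- sample values.
jd_tide = np.array([0.00, 0.25, 0.50, 0.75, 1.00])
h_tide = np.array([0.10, 0.40, 0.10, -0.20, 0.10])

# Ship-track times at which an interpolated tide value is needed.
jd_track = np.array([0.10, 0.60])

# Linear interpolation, as in linterp; splining the tide data could overshoot.
tide_track = np.interp(jd_track, jd_tide, h_tide)

# Corrected depth = raw fathometer depth minus the interpolated tide.
h_track = np.array([3.0, 2.5])
h_corrected = h_track - tide_track
```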
    
    The result is a comma-delimited text file with 8 columns of information: jdtime, longitude, latitude, depth_m, ir_tide, ir_cor, rd_tide, rd_cor The definition of this header information is as follows:
    jdtime: julian day and time from the GPS
    longitude: longitude from the GPS
    latitude: latitude from the GPS
    depth_m: depth in meters of the fathometer (as recorded by the GPS)
    ir_tide: interpolated tide values at the GPS time from the Indian River Inlet tide station
    ir_cor: the corrected depth in meters of the fathometer based on the Indian River tide value (depth_m - ir_tide)
    rd_tide: interpolated tide values at the GPS time from the Rosedale Beach tide station
    rd_cor: the corrected depth in meters of the fathometer based on the Rosedale tide value (depth_m - rd_tide)
    
    The header line was added to each veetest output file, which was then saved under a new name: hypbathytides.csv (from the HYPACK plus JD102 run) and jd105_moredepths.csv (from the JD105 run). Data sources used in this process:
    • bathy.dat
    Data sources produced in this process:
    • hypbathytides.csv
    • jd105_moredepths.csv
    Date: Apr-2010 (process 15 of 26)
    Each of the resulting CSV files were added to ArcMap 9.2 as event themes: Tools - Add XY Data. The projection was defined as Geographic, WGS84. These event themes were then converted to shapefiles - right mouse click on the event theme - Data Export. Data sources used in this process:
    • hypbathytides.csv
    • jd105_moredepths.csv
    Data sources produced in this process:
    • hypbathytides.shp
    • jd105_moredepths.shp
    Date: Apr-2010 (process 16 of 26)
    The resulting shapefiles are geographic. In order to apply a distance-weighted tide influence to each point, the point shapefiles were projected to UTM, Zone 18 so that distance values are in meters: ArcMap 9.2 - ArcToolbox - Data Management Tools - Projections and Transformations - Feature. Input each geographic shapefile, and output a new shapefile in a UTM, Zone 18, WGS84 projection - no transformation necessary. Data sources used in this process:
    • hypbathytides.shp
    • jd105_moredepths.shp
    Data sources produced in this process:
    • hypbathytides_utm18.shp
    • jd105_moredepths_utm18.shp
    Date: Apr-2010 (process 17 of 26)
    The tide station locations are known, so that latitude/longitude information was added to a comma-delimited text file and added to ArcMap 9.2 as an event theme and converted to a shapefile. Within ArcMap 9.2, the Geographic, WGS84 shapefile was projected to a UTM, Zone 18, WGS84 shapefile. Data sources used in this process:
    • tidesta_locs.csv
    Data sources produced in this process:
    • tidestations.shp
    • tidestations_utm18.shp
    Date: Apr-2010 (process 18 of 26)
    The UTM tide station shapefile and the two shapefiles with the bathymetry and tide information were added to ArcMap 9.2. One tide station (Indian River Inlet) is at the eastern end of the study area, while the other (Rosedale Beach) is at the western end. Instead of using the adjustment from a single tide station, a weighted tide adjustment was applied based on the distance to each station. To get the distance to each tide station into the bathymetry shapefiles, the Near tool was used (ArcToolbox - Analysis Tools - Proximity - Near). Prior to using the tool, the following attributes were added to the shapefile (as type double): dist_inlet, dist_rose, wt_tide, and wttide_dep. With these attributes added, a definition query was applied to the tide station shapefile so that only one station was present, because the Near tool reports the distance only to the closest feature when two are available. To get the distance to each tide station, the Near tool therefore had to be run twice, with only one station available each time. With one tide station visible, the Near tool was run with these parameters: input: hypbathytides_utm18; near features: tidestations_utm18; everything else left at default (no search radius). The "near_dist" attribute (automatically created by the Near tool) was then copied to the distance attribute created for that station, and the process was repeated for the other tide station. The same procedure (adding the attributes and running the Near tool with each tide station) was repeated with jd105_moredepths_utm18.shp. Data sources used in this process:
    • hypbathytides_utm18.shp
    • jd105_moredepths_utm18.shp
    • tidestations_utm18.shp
    Data sources produced in this process:
    • hypbathytides_utm18.shp
    • jd105_moredepths_utm18.shp
    Date: Apr-2010 (process 19 of 26)
    In ArcMap 9.2, after using the Near tool and copying the distances to the tide stations into the appropriate fields, use the field calculator on the "wt_tide" attribute.
    
    
    wt_tide = ((1-([dist_rose]/( [dist_rose] + [dist_inlet])))* [rd_tide]) + ((1-( [dist_inlet]/( [dist_rose] + [dist_inlet])))* [ir_tide])
    
    
    This gives a weighted tide value for each point based on its distance to each tide station. The tide correction was then applied to populate the "wttide_dep" attribute: wttide_dep = [depth_m] - [wt_tide] Data sources used in this process:
    • hypbathytides_utm18.shp
    • jd105_moredepths_utm18.shp
    Data sources produced in this process:
    • hypbathytides_utm18.shp
    • jd105_moredepths_utm18.shp
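
    The distance weighting used in the field calculation above can be sketched as follows (a hedged Python rendering of the same formula; the argument names mirror the shapefile attributes):

```python
def weighted_tide(dist_rose, dist_inlet, rd_tide, ir_tide):
    """Inverse-distance weighting: the nearer station receives the larger weight."""
    total = dist_rose + dist_inlet
    w_rose = 1 - dist_rose / total     # weight for the Rosedale Beach tide
    w_inlet = 1 - dist_inlet / total   # weight for the Indian River Inlet tide
    return w_rose * rd_tide + w_inlet * ir_tide

# Midway between the stations, the two tides are averaged equally.
mid = weighted_tide(1000.0, 1000.0, 0.2, 0.4)
# A point at the Rosedale gauge (dist_rose = 0) takes the Rosedale value entirely.
at_rose = weighted_tide(0.0, 2000.0, 0.2, 0.4)
```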
    Date: Apr-2010 (process 20 of 26)
    Use topo2raster to generate a new grid. ArcMap 9.2 - ArcToolbox - Spatial Analyst Tools - Interpolation - Topo to Raster.
    Inputs:
    bounding_poly; ; boundary
    jd105_moredepths_utm18; wttide_dep; PointElevation
    hypbathytides_utm18; wttide_dep; PointElevation
    output: bestbathy
    cell size: 25
    drainage enforcement: no_enforce
    primary type of input data: spot
    
    Everything else left to defaults.
    
    
    The bounding polygon was a quick polygon created around the survey area to limit the extent of extrapolation in areas where no data were available. Data sources used in this process:
    • jd105_moredepths_utm18
    • hypbathytides_utm18
    Data sources produced in this process:
    • bestbathy
    Date: Apr-2010 (process 21 of 26)
    Then filter the resulting grid: ArcToolbox - Spatial Analyst Tools - Neighborhood - Filter
    Input: bestbathy
    Output: bestbat_fil
    Filter type: low
    Check the box next to ignore NODATA in calculations.
    
    Data sources used in this process:
    • bestbathy
    Data sources produced in this process:
    • bestbat_fil
    Date: Sep-2010 (process 22 of 26)
    Used the Raster Calculator on the Spatial Analyst toolbar in ArcMap 9.2 to multiply the grid bestbat_fil by -1. Then right-clicked the calculation grid - Data - Make Permanent, with the output: bbat_filneg Data sources used in this process:
    • bestbat_fil
    Data sources produced in this process:
    • bbat_filneg
    Date: Dec-2010 (process 23 of 26)
    The bathymetry grid was renamed to irb_bathy in ArcCatalog 9.2. Data sources used in this process:
    • bbat_filneg
    Data sources produced in this process:
    • irb_bathy
    Date: 02-Oct-2017 (process 24 of 26)
    Edits to the metadata were made to fix any errors that MP v 2.9.36 flagged. This is necessary to enable the metadata to be successfully harvested for various data catalogs. In some cases, this meant adding text "Information unavailable" or "Information unavailable from original metadata" for those required fields that were left blank. Other minor edits were probably performed (title, publisher, publication place, etc.). Attempted to modify http to https where appropriate. Moved the minimal source information provided to make it the first process step. The metadata date (but not the metadata creator) was edited to reflect the date of these changes. The metadata available from a harvester may supersede metadata bundled within a download file. Compare the metadata dates to determine which metadata file is most recent. Person who carried out this activity:
    U.S. Geological Survey
    Attn: VeeAnn A. Cross
    Marine Geologist
    384 Woods Hole Road
    Woods Hole, MA

    508-548-8700 x2251 (voice)
    508-457-2310 (FAX)
    vatnipp@usgs.gov
    Date: 20-Jul-2018 (process 25 of 26)
    USGS Thesaurus keywords added to the keyword section. Person who carried out this activity:
    U.S. Geological Survey
    Attn: VeeAnn A. Cross
    Marine Geologist
    384 Woods Hole Road
    Woods Hole, MA

    508-548-8700 x2251 (voice)
    508-457-2310 (FAX)
    vatnipp@usgs.gov
    Date: 08-Sep-2020 (process 26 of 26)
    Added keywords section with USGS persistent identifier as theme keyword. Person who carried out this activity:
    U.S. Geological Survey
    Attn: VeeAnn A. Cross
    Marine Geologist
    384 Woods Hole Road
    Woods Hole, MA

    508-548-8700 x2251 (voice)
    508-457-2310 (FAX)
    vatnipp@usgs.gov
  3. What similar or related data should the user be aware of?

How reliable are the data; what problems remain in the data set?

  1. How well have the observations been checked?
  2. How accurate are the geographic locations?
    The navigation system used was a Lowrance 480M with an LGC-2000 Global Positioning System (GPS) antenna. The antenna was located directly above the fathometer transducer mount point.
  3. How accurate are the heights or depths?
    All bathymetry values were collected by the 200 kHz Lowrance fathometer. The fathometer was mounted starboard side aft, directly below the GPS antenna and the resistivity streamer tow point. The manufacturer indicates that the speed of sound used by the system to convert travel times to depths is 4800 feet/second. All values are assumed to be accurate to within 1 meter. The gridding, grid smoothing, and other processing affect the accuracy of individual cell values, as these data were not collected with the intention of generating a grid and have areas of sparse coverage.
  4. Where are the gaps in the data? What is missing?
    This grid represents all the valid fathometer readings recorded by the HYPACK navigation software, plus some bathymetric readings recorded by the continuous resistivity profile system when the HYPACK system was not logging data.
  5. How consistent are the relationships among the observations, including topology?
    All the data were evaluated with the same criteria.

How can someone get a copy of the data set?

Are there legal restrictions on access or use of the data?
Access_Constraints: None.
Use_Constraints:
Not to be used for navigation. The public domain data from the U.S. Government are freely redistributable with proper metadata and source attribution. Please recognize the U.S. Geological Survey as the originator of the dataset.
  1. Who distributes the data set? (Distributor 1 of 1)
    VeeAnn A. Cross
    U.S. Geological Survey
    Marine Geologist
    Woods Hole Coastal and Marine Science Center
    Woods Hole, MA

    (508) 548-8700 x2251 (voice)
    (508) 457-2310 (FAX)
    vatnipp@usgs.gov
  2. What's the catalog number I need to order this data set? Downloadable Data
  3. What legal disclaimers am I supposed to read?
    Neither the U.S. government, the Department of the Interior, nor the USGS, nor any of their employees, contractors, or subcontractors, make any warranty, express or implied, nor assume any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, nor represent that its use would not infringe on privately owned rights. The act of distribution shall not constitute any such warranty, and no responsibility is assumed by the USGS in the use of these data or related materials. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.
  4. How can I download or order the data?
  5. What hardware or software do I need in order to use the data set?
    This WinZip file contains data available in Esri binary grid format. The user must have software capable of uncompressing the WinZip file and reading/displaying the grid.

Who wrote the metadata?

Dates:
Last modified: 08-Sep-2020
Metadata author:
VeeAnn A. Cross
U.S. Geological Survey
Marine Geologist
Woods Hole Coastal and Marine Science Center
Woods Hole, MA

(508) 548-8700 x2251 (voice)
(508) 457-2310 (FAX)
vatnipp@usgs.gov
Metadata standard:
FGDC Content Standards for Digital Geospatial Metadata (FGDC-STD-001-1998)

This page is <https://cmgds.marine.usgs.gov/catalog/whcmsc/open_file_report/ofr2011-1039/irb_bathymeta.faq.html>
Generated by mp version 2.9.50 on Tue Sep 21 18:20:35 2021