Data processing and editing at sea
Processing of the data was carried out by USGS personnel with assistance from the Ocean Mapping Group at the University of New Brunswick (UNB), Canada. A suite of processing software called SwathEd, developed by the Ocean Mapping Group (<www.omg.unb.ca/~jhc/SwathEd.html>), was used to process, edit, and archive the bathymetric soundings.
The processing and editing steps carried out on board the ship were:
1. Demultiplex (unravel) the Simrad data files using RT to generate separate files for a given raw file (filename.raw_all) containing navigation (filename.nav), depth soundings (filename.merged), sidescan sonar backscatter values (filename.merged_ssdata), acquisition parameters (filename.param), sound velocity at the transducer (filename.sv_tdcr), and sound velocity profiler information (filename.svp).
Command line: RT -packdown -background -WRITE -em1000 -CnC -prefix RawFiles/ -suffix _raw.all -out ProcessedFiles/InputFilenamePrefix InputFilenamePrefix
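The demultiplexing step above fans one raw file out into six per-datatype files that share a common prefix. The following sketch simply enumerates the expected output paths; the helper name and the output directory default are illustrative (the directory follows the -out argument in the command above), not part of the RT program itself:

```python
# Suffixes produced by the RT demultiplexer, per step 1 above.
RT_SUFFIXES = [".nav", ".merged", ".merged_ssdata",
               ".param", ".sv_tdcr", ".svp"]

def demux_outputs(prefix, outdir="ProcessedFiles"):
    """Expected output paths for `RT ... -out ProcessedFiles/<prefix> <prefix>`.

    A bookkeeping sketch only -- useful for checking that a batch of
    raw files was fully demultiplexed before further processing.
    """
    return [f"{outdir}/{prefix}{suffix}" for suffix in RT_SUFFIXES]
```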
2. Edit the navigation data on-screen using the SwathEd routine jview to remove undesirable points, including turns at the ends of survey lines. The jview routine also rejects stray GPS fixes outside of the survey area, as set by the operator.
Command line: jview -rejectnav -navedit filename.nav MassbayMap.sun_315
3. Edit the multibeam soundings for each trackline. SwathEd displays blocks of data across and along the trackline. Anomalous points were identified by comparison to other points and by an understanding of the sea floor geology and morphology. Anomalous soundings were removed.
Command line: swathed filename.merged
4. Map the bathymetric soundings from each processed data file onto a Mercator grid, or digital terrain model (DTM), using weigh_grid with grid nodes spaced at 6 meters. The weigh_grid program builds the DTM by summing the weighted contributions of all supplied soundings at each grid node. A sounding contributes only within a user-defined region around the node (the cutoff). The weight of an individual contribution decays with distance from the node according to the order of the Butterworth (inverse weighting) filter (-power n), the width of the flat-topped radius of the weighting function (-lambda n.n), and the absolute cutoff distance (-cutoff) for data points contributing to a node. In addition, each beam can be pre-weighted to account for its reliability; a custom weight file for each beam is created by hand and supplied to weigh_grid with the -custom_weight option.
The weigh_grid program uses three files: a ".r4" file, which contains the grid node values in binary format; a ".r4_weights" file, which contains the sum of the weights of the data points contributing to each node; and a ".r4_weight_depth" file, which contains the sum of the weights multiplied by the depths. The solution at each node is then weight_depth/weights. For more information, visit the OMG website at <http://www.omg.unb.ca>.
First, a blank binary grid and the weights and weight_depth files must be created:
Command line: make_blank -float gridFile
Command line: tor4 gridFile
These two steps create all three files with a gridFile prefix.
Next, each individual data file (.merged suffix) is added to the grid using weigh_grid. The western Massachusetts Bay Quadrangles 1-3 were created with a grid node spacing of 6 meters, a cutoff radius of 12 meters, and an inner radius to the weighting function of 3 meters:
Command line: weigh_grid -omg -tide -coeffs -mindep -2 -maxdep -800 -beam_mask -beam_weight -custom_weight EM1000_Weights -butter -power 2 -cutoff 12 -lambda 3 gridFile filename.merged
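The weighted-gridding scheme described in step 4 can be sketched as follows. This is an illustrative Python implementation, not the OMG code: the exact Butterworth rolloff used by weigh_grid may differ, and the function and parameter names (butterworth_weight, grid_node_depth, lam) are assumptions. It does show the bookkeeping the text describes: a running sum of weights (the .r4_weights file) and of weight × depth (the .r4_weight_depth file), with the node value given by their ratio, using the Quadrangle 1-3 parameters (power 2, lambda 3 m, cutoff 12 m) as defaults:

```python
import math

def butterworth_weight(r, power=2, lam=3.0, cutoff=12.0):
    """Weight of a sounding at distance r (meters) from a grid node.

    Flat-topped (weight 1.0) out to `lam`, then a Butterworth-style
    decay of order `power`; zero beyond `cutoff`.
    """
    if r > cutoff:
        return 0.0
    if r <= lam:
        return 1.0
    return 1.0 / (1.0 + ((r - lam) / (cutoff - lam)) ** (2 * power))

def grid_node_depth(soundings, node, **weight_args):
    """Weighted-mean depth at one grid node.

    Accumulates sum(w) and sum(w * depth) over (x, y, depth) soundings,
    mirroring the .r4_weights and .r4_weight_depth files, and returns
    weight_depth / weights (NaN if no sounding is within the cutoff).
    """
    weights = 0.0
    weight_depth = 0.0
    for x, y, depth in soundings:
        r = math.hypot(x - node[0], y - node[1])
        w = butterworth_weight(r, **weight_args)
        weights += w
        weight_depth += w * depth
    return weight_depth / weights if weights > 0 else float("nan")
```

Because the per-node sums are kept separately, new survey lines can be folded into an existing grid incrementally, which is why weigh_grid is run once per .merged file against the same gridFile.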