
FEFLOW Data Tools –
Tools to convert, debug and modify FEFLOW import and export files

1. General information

Scripts for everyday data preparation work...

This collection of utilities was created at WASY GmbH to help with the everyday work with FEFLOW, solving problems in the fields of

• data debugging

• data conversion

• data statistics

• data selection and extraction

Operations that would typically be done on a PC with standard spreadsheet software packages can be performed with these tools without any limitation on the number of lines.

The tools are UNIX shell scripts. They use the ‘awk’ and ‘nawk’ or ‘gawk’ commands. ‘awk’ and ‘nawk’ come with the UNIX operating system. For ‘gawk’, please refer to the GNU software distributions and consider the public GNU license. These tools offer universal possibilities for file data operations.

Running on UNIX systems:

The FEFLOW Data Tools are directly executable on all UNIX systems with installed ‘awk’ and ‘nawk’ commands.

Running on WINDOWS systems:

Start the tools under a UNIX shell emulation program like the ‘bash’ in the CYGWIN package. CYGWIN can be installed from the win32/cygwin directory of the FEFLOW DVD. No compilation is necessary. The programs are the same for all hardware platforms. The FEFLOW Data Tools are plain ASCII files. The user can read them to understand the functionality. You are invited to adapt the tools to your specific needs. Hints for ‘awk’ programming can be found in the ‘awk’, ‘nawk’ and ‘gawk’ manual pages.

Copyright and warranty

There is no kind of copyright on the FEFLOW Data Tools. WASY provides no software support for these tools. There is absolutely no warranty on the correct function of the tools! If you use third-party software like UNIX shell emulation software, you have to respect the licensing conditions of this software.

UNIX and MSDOS files

It is possible to read MSDOS files as well as UNIX files. By default, the program output uses UNIX line feeds on UNIX machines and MS-DOS line feeds on a PC. Using the ‘-msdos’ parameter on UNIX machines or the ‘-unix’ parameter on a PC you can change the output file format. So you can also use the tools to convert your files between the different platforms.
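As an illustration only (the file name is hypothetical, and it is assumed here that the ‘-unix’/‘-msdos’ switches are accepted in addition to the tool-specific options listed later), a triplet file created on a PC could be rewritten with UNIX line feeds like this:

debug_trp -unix test1.trp > test1_unix.trp

Pass the data through unchanged but write the output with UNIX line feeds.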

Help messages

Help messages about program usage and function are displayed by typing the plain program name or the name with the parameter ‘-h’. If wrong parameters are used, the usage message is displayed as well.

Program usage basics

The typical UNIX syntax for program parameters is used:

value : A simple value (e.g., a file name). It has to be separated by blanks. Write the value without quotes.

-a : A parameter without a value. Type it without a blank between the minus sign and the parameter name.

[-a] : Parameters between brackets are optional.

-a | -b | -c : Vertical lines separate alternative parameters. Only one of the parameters ‘-a’, ‘-b’ or ‘-c’ can be used.

-a value : A parameter with a numerical value. Type it without a blank between the minus sign and the name but with a blank between the name and the value. Write the value without quotes.

-a ‘string’ : A parameter with a string value. Type it without a blank between the minus sign and the name but with a blank between the name and the string. Write the string with quotes.

value ... : A list of values. The number of values is arbitrary. The values have to be separated by blanks.

The program output...

...goes to stdout (into the calling window), so you can observe the program results.

The input files are never changed. To write the output into a file you have to use the UNIX-like redirections.

For example, a ‘bash’ or ‘sh’ user can write:

trp2pnt -f 0.01 test1_head.trp

Write to stdout (= output into the calling window).

trp2pnt -f 0.01 test1_head.trp > test1.pnt

Create and write a file ‘test1.pnt’. If the file already exists, the program overwrites the file.

trp2pnt -f 0.01 test1_head.trp >> test1.pnt

Create a file ‘test1.pnt’ or append to the file if it already exists.


Further development of the FEFLOW data tools

WASY collects FEFLOW data tools, including user-developed tools. This pool will be published on future FEFLOW DVDs. That is why we are interested in your modifications and developments. You are also invited to send us your critical remarks as well as your problems. Please contact support@wasy.de.


2. Overview of the FEFLOW Data Tools

dar2pow
[-p ‘H’ | ‘C’ | ‘T’] [-id start_id] dar_file observation_point_id [observation_point_id ...]

dar2trp
[-p ‘H’ | ‘C’ | ‘T’] [-s stepnumber] dar_file

debug_pow
[-discard] [-ci ‘ID_condition_string’] [-cp ‘Point_condition_string’] [-o ‘operation_string’] pow_file

debug_trp
[-discard] [-comment] [-c ‘condition_string’] [-o ‘operation_string’] trp_file

describe_pow
[-ci ‘ID_condition_string’] [-cp ‘Point_condition_string’] pow_file

describe_trp
trp_file

grid2trp
grid_file

mix_pnt+trp
[-s snap_distance] ascii_point_file trp_file

mix_pow
[-comment] [-discard | -merge] [-o ‘operation_string’] pow_file1 pow_file2

mix_pow_id
pow_file id_file

mix_pow_tab
pow_file tab_file

mix_trp
[-discard | -merge] [-comment] [-s snap_distance] [-o ‘operation_string’] trp_file1 trp_file2

opera_trp
[-o ‘operation_string’] trp_file1 trp_file2

ply_deldouble
[-comment] [-s snap_distance] trp_file

pnt2trp
[-sort] [-f factor] ascii_point_file

set_lin_id
[-id ID-value] ascii_line_file

tin2trp
[-sort] [-f factor] ascii_tin_file

trp2pow
[-id start_id] [-o ‘operation_string’] [-s snap_distance] definition_file trp_file

trp2pow_acc
[-t start_time] [-id start_id] [-o ‘operation_string’] definition_file trp_file

trp2pnt
[-f factor] trp_file

trp2grid
[-min | -max | -average | -sum] [-x grid_x_origin] [-y grid_y_origin] [-w grid_width] trp_file

trp_deldouble
[-comment] [-min | -max | -average | -sum] [-s snap_distance] trp_file


‘pnt2trp’ and ‘trp2pnt’

usage: pnt2trp [-sort] [-f factor] ascii_point_file

usage: trp2pnt [-f factor] trp_file

The data conversion programs ‘pnt2trp’ and ‘trp2pnt’ enable the ‘mis-use’ of the ID in ASCII POINT files (compatible with the ArcInfo® defined ASCII format) for point data exchange between ArcInfo and FEFLOW. The function values of the triplet files can be scaled into the integer ID and vice versa. In pnt2trp, the option -sort activates data point sorting and the deletion of multiple points with identical coordinate strings.
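A minimal sketch of a round trip; the file names and factors are only illustrations, and it is assumed that ‘-f’ simply scales the value during the conversion (check the help text shown by ‘-h’):

pnt2trp -sort -f 0.001 wells.pnt > wells.trp

Convert the ASCII point file into a triplet file, scale the integer IDs by 0.001 to obtain the function values, sort the points and remove duplicates.

trp2pnt -f 1000.0 wells.trp > wells_out.pnt

Convert the triplet file back into an ASCII point file, scaling the function values into integer IDs.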

‘debug_trp’

usage: debug_trp [-discard] [-comment] [-c ‘condition_string’] [-o ‘operation_string’] trp_file

‘debug_trp’ provides basic spreadsheet functionality on triplet files. Operations will be performed on all points or, if a condition_string parameter is used, on the selected points.
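A sketch of a typical call; the file name and numbers are illustrations, and the variable name F for the function value in the condition and operation strings is an assumption (the help text printed when debug_trp is called without parameters shows the exact names):

debug_trp -c 'F > 100.0' -o 'F = F * 0.3048' heads_ft.trp > heads_m.trp

Select only the points with a function value above 100 and convert their values from feet to metres; unselected points pass through unchanged because -discard is not used.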


‘mix_trp’

usage: mix_trp [-discard | -merge] [-comment] [-s snap_distance] [-o ‘operation_string’] trp_file1 trp_file2

‘mix_trp’ provides basic spreadsheet functionality on triplet files. Operations will be performed on the points that are contained in both triplet files.

‘mix_trp’ runs a forward loop along the data points in trp_file1 (reference points). At every reference point, trp_file2 is scanned in a forward loop for a point that lies within a circle of radius snap_distance around the reference point (the corresponding point). The scanning of trp_file2 stops as soon as one corresponding point has been found. All operations that calculate the function value of the output according to the -o ‘operation_string’ refer only to a pair of reference and corresponding point.

With the -discard option, all reference points from trp_file1 without a corresponding point in trp_file2 do not appear in the program output. Points from trp_file2 that are not a corresponding point to the trp_file1 data are also omitted from the program output.

With the -merge option, all points from trp_file1 and trp_file2 that are not part of a reference point/corresponding point pair are written to the program output without any changes.

With the -comment option, the program appends a short comment to every output data line that comes from a found reference point/corresponding point pair. This comment output must not appear in a FEFLOW import data file, so you should use this option only to check the program output.
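A sketch of a typical call; the file names and the snap distance are illustrations:

mix_trp -merge -s 0.25 survey_2001.trp survey_2002.trp > survey_all.trp

Combine two measurement campaigns: points of survey_2002.trp lying within 0.25 length units of a point in survey_2001.trp form reference/corresponding pairs, and because of -merge all remaining points of both files are passed through unchanged. An additional -o ‘operation_string’ could combine the two function values of each pair.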


‘dar2pow’

usage: dar2pow [-p ‘H’ | ‘C’ | ‘T’] [-id start_id] dar_file observation_point_id [observation_point_id ...]

dar2pow extracts from the results file the time steps and values of the defined parameter (H, C or T) for the defined observation points and writes them into a powerfunction file. The start_id determines the first powerfunction ID written to the output file. This ID is increased automatically. The default ID is -999. If FEFLOW finds a negative ID value, the following power function will be placed at the next free FEFLOW time-function ID. If FEFLOW finds a positive ID in the imported powerfunction file, this ID will also be used in FEFLOW.
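A sketch; the results file name and the observation point numbers are illustrations:

dar2pow -p 'H' -id 100 model_run.dar 12 57 133 > computed_heads.pow

Extract the computed hydraulic head histories at the observation points 12, 57 and 133 and write them as powerfunctions with the IDs 100, 101 and 102.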

‘dar2trp’

usage: dar2trp [-p ‘H’ | ‘C’ | ‘T’] [-s time_step_number] dar_file

Using dar2trp you can generate triplet files for a specific time step. That is useful to analyze differences between the FEFLOW results and measured values at the observation points. If no -s time_step_number is defined, the last time step is used as the default.
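A sketch; the file name and step number are illustrations:

dar2trp -p 'H' -s 25 model_run.dar > heads_step25.trp

Write the hydraulic head values of time step 25 as a triplet file; without -s the last time step would be used.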


‘mix_pnt+trp’

usage: mix_pnt+trp [-s snap_distance] ascii_point_file trp_file

This program is useful if FEFLOW results or model property data are to be exported into ArcInfo. ArcInfo tables can read the resulting file using the ‘add’ command.

‘describe_trp’

usage: describe_trp [-c ‘condition_string’] trp_file

describe_trp offers the simplest statistical analysis on triplet files.


‘set_lin_id’

usage: set_lin_id [-id ID-value] ascii_line_file

The color of line background maps in FEFLOW is computed from the line ID. Four colors are used. They are defined using the modulo operator on the ID: color = Line-ID % 4. Use set_lin_id to change the background map color.
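A sketch; the file name is an illustration:

set_lin_id -id 2 rivers.lin > rivers_recolored.lin

Set the line ID to 2, so FEFLOW draws the background map in the color with index 2 % 4 = 2.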

‘tin2trp’

usage: tin2trp [-sort] [-f factor] ascii_tin_file

Data from the ArcInfo TIN module can be exported using the ASCII TIN format. tin2trp converts these data into the FEFLOW-supported format. The Z coordinate can be scaled to compute the function value in the program output. This is a good basis for FEFLOW’s regionalization tools to compute slice elevations or other parameter distributions.
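A sketch; the file names are illustrations, and it is assumed that -f scales the Z coordinate into the function value:

tin2trp -sort -f 1.0 surface.tin > surface_elevation.trp

Convert an ASCII TIN export into a triplet file carrying the (unscaled) Z coordinate as function value, sort the points and remove duplicates; the result can then be regionalized in FEFLOW to interpolate slice elevations.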


‘trp2grid’

usage: trp2grid [-min | -max | -average | -sum] [-x grid_x_origin] [-y grid_y_origin] [-w grid_width] trp_file

If triplets are to be converted into a quadratic raster grid, you can use trp2grid. Further conversions of the grid file may also be performed using the debug_trp program (e.g. from row/column coordinates into the real-world coordinates of the cell-center point).
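A sketch; the file names, origin and cell width are illustrations, and it is assumed that the aggregation option determines the value written for each cell:

trp2grid -average -x 4500000.0 -y 5800000.0 -w 100.0 scattered.trp > gridded.trp

Aggregate the scattered points into square cells of width 100 anchored at the given origin, writing the average of all points that fall into each cell.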


‘trp_deldouble’

usage: trp_deldouble [-comment] [-min | -max | -average | -sum] [-s snap_distance] trp_file

FEFLOW’s regionalization tools need input data without multiple values at the same location. This can be achieved using trp_deldouble. Using different snap_distance values, you can also use the tool for data reduction purposes.
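A sketch; the file names and the snap distance are illustrations, and it is assumed that the aggregation option determines the function value of the point that is kept:

trp_deldouble -average -s 10.0 boreholes.trp > boreholes_thinned.trp

Replace groups of points lying within 10 length units of each other by a single point carrying the average function value, which removes duplicates and also thins out dense data.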

‘ply_deldouble’

usage: ply_deldouble [-comment] [-s snap_distance] trp_file

Multiple points in polygons are possible in the ASCII format, but FEFLOW tools like the supermesh filter or the JOIN method do not expect multiple points. Various errors and effects may occur when multiple points are used. You should check your polygons with this tool before they are used in FEFLOW. Using the -comment option and different snap_distance values you can get an overview of the point densities in the input file.
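A sketch; the file names and the snap distance are illustrations:

ply_deldouble -comment -s 1.0 catchment.ply

Inspect the polygons: the appended comments indicate which points lie within 1.0 length units of each other.

ply_deldouble -s 1.0 catchment.ply > catchment_clean.ply

Write a cleaned polygon file without the multiple points, ready for the supermesh filter or the JOIN method.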


‘debug_pow’

usage: debug_pow [-discard] [-ci ‘ID_condition_string’] [-cp ‘Point_condition_string’] [-o ‘operation_string’] pow_file

debug_pow provides basic spreadsheet functionality on FEFLOW powerfunction files. Operations will be performed on:

- all selected points (selected by -cp ‘Point_condition_string’, default: all points selected)

within

- all selected functions (selected by -ci ‘ID_condition_string’, default: all functions selected)

Unselected points or functions are written to the output without any changes or, if the -discard parameter is defined, they are hidden in the output.
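A sketch, assuming the condition and operation strings use the same ID, T and F variable names as describe_pow below; the file name and numbers are illustrations:

debug_pow -ci 'ID == 4711' -o 'F = F * 86400.0' pumping.pow > pumping_per_day.pow

Rescale only the powerfunction with the ID 4711 from a per-second to a per-day rate; all other functions pass through unchanged because -discard is not used.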

‘describe_pow’

usage: describe_pow [-ci ‘ID_condition_string’] [-cp ‘Point_condition_string’] pow_file

describe_pow analyses the selected powerfunctions and powerfunction points and writes out some descriptive and statistical information.

The ID_condition_string serves to select complete powerfunctions from the file. The Point_condition_string serves to select a number of powerfunction points from the powerfunctions selected by the ID_condition_string.

The analysis is applied to all selected powerfunctions and their selected powerfunction points.

A powerfunction in the pow_file is accessible by its ‘ID’. If no ID is selected in the ID_condition_string, all IDs (i.e. all powerfunctions) are selected. The columns of the powerfunction are accessible as ‘T’ and ‘F’. If no points are selected using the T and F variables in the Point_condition_string, then all points of the selected powerfunctions are selected (default). The condition may be defined by Boolean combinations (!, ||, &&) of relational expressions built with ==, !=, <, >, <= and >=, e.g. ‘ID < 200.0’.
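Two sketches of typical selections; the ID numbers and the time limit are illustrations:

describe_pow -ci 'ID == 7 || ID == 8' gaug_curves.pow

Analyze only the powerfunctions 7 and 8.

describe_pow -cp 'T >= 0.0 && T <= 365.0' gaug_curves.pow

Analyze, within every powerfunction, only the points of the first 365 time units.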

Example output of describe_pow gaug_curves.pow:

‘mix_pow’

usage: mix_pow [-comment] [-discard | -merge] [-o ‘operation_string’] pow_file1 pow_file2

For each powerfunction in pow_file1, mix_pow looks for a corresponding powerfunction with an identical ID in pow_file2. For these common powerfunctions, all data points from pow_file1 and pow_file2 are merged and written out sorted by time. Operations on the time levels and function values can be defined in the operation_string; they are performed before sorting.

In the operation_string the user can access the time level values as variables named T1 and T2. Function values can be accessed via the variables F1 and F2.

Examples:

‘T2 = T2 + 12000.00’ : Time shift of the powerfunction from pow_file2.

‘F1 = log (F1); F2 = log (F2)’ : Write logarithmized values.

Operations can use functions (e.g. atan2, cos, exp, int, log, rand, sin, sqrt, ... see the manual pages for ‘nawk’). If no operation_string is defined, no operation is performed.

If the -comment option is used, a second comment line is inserted for every mixed powerfunction.

If the -discard option is used, only the powerfunctions common to pow_file1 and pow_file2 are written to standard output. Otherwise, powerfunctions from pow_file1 without a corresponding powerfunction in pow_file2 are also written to standard output without any changes.

The -merge option writes all powerfunctions from pow_file1 AND pow_file2. The operation_string is applied only to corresponding powerfunctions.
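A sketch that reuses the time-shift operation from the example above; the file names are illustrations:

mix_pow -merge -o 'T2 = T2 + 12000.00' lake_levels.pow river_levels.pow > all_levels.pow

Shift the time axis of the functions coming from river_levels.pow, merge the data points of functions with identical IDs sorted by time, and copy all functions that exist in only one of the two files without any changes.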


‘mix_pow_id’

usage: mix_pow_id pow_file id_file

mix_pow_id is a useful tool to derive powerfunctions with different offsets from one given powerfunction. For example, you have the time dependency of a flood wave as input powerfunction and you wish to create powerfunctions of this flood wave adapted to different positions along a river with different mean elevations (offsets). Then you put the elevations into the id_file and the powerfunction into the pow_file, and as a result you get the shifted powerfunctions for your different offset values.

mix_pow_id generates for each entry in the id_file a power function with the ID from the id_file, with time values from the pow_file and function values computed as the sum of the value in the id_file and the values in the pow_file.

The pow_file must contain only one powerfunction. Comment lines at the header of the pow_file can begin with ‘!’ or ‘#’.

In the id_file, all lines with two entries are used as data lines. Comment lines, empty lines or the END line are allowed if they have less or more than two entries.

Example:
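A sketch; the file names and numbers are illustrations, and it is assumed that the ID stands in the first column of the id_file. A hypothetical id_file ‘river_offsets.id’ with one ID and one elevation offset per line:

101 34.50
102 35.10
103 36.25
END

mix_pow_id floodwave.pow river_offsets.id > river_boundaries.pow

For every data line of the id_file a copy of the flood wave powerfunction is written, carrying the ID from the id_file and function values shifted by the corresponding elevation.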


‘mix_pow_tab’

usage: mix_pow_tab pow_file tab_file

Using mix_pow_tab you can convert your time-dependent data from database format (with different columns for different time stages) into the FEFLOW powerfunction file format with data pairs (time, value).

mix_pow_tab generates for each data line in the tab_file a power function with the ID from the 1st column of the tab_file, with time values from the pow_file and function values from columns 2 and following of the tab_file.

The pow_file must contain only one column with time level values. Comment lines at the header of the pow_file can begin with ‘!’ or ‘#’.

In the tab_file the first line contains the column names. It will be skipped. The columns are: 1st column: ID. Following columns: data values for the time levels defined in the pow_file.

Example:
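A sketch; the file contents are illustrations. A hypothetical pow_file ‘times.pow’ holding only the time levels:

0.0
30.0
60.0
90.0

and a hypothetical tab_file ‘wells.tab’ exported from a database (first line: column names):

well_id q_t0 q_t30 q_t60 q_t90
201 -120.0 -110.0 -95.0 -100.0
202 -80.0 -85.0 -90.0 -80.0

mix_pow_tab times.pow wells.tab > wells.pow

For every data line of the tab_file one powerfunction is written, with the well ID as powerfunction ID and the four rates assigned to the four time levels.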

‘trp2pow’

usage: trp2pow [-id start_id] [-o ‘operation_string’] [-s snap_distance] definition_file trp_file

This program is useful to generate output in FEFLOW’s powerfunction file format from point measurement data files taken at different measurement times. The definition_file lists the used time values and, for every time value, the corresponding triplet file in the following format:

! Comment lines beginning with ‘!’ are possible.
time1 trp_file_at_time1
time2 trp_file_at_time2
timeX trp_file_at_timeX
END

trp2pow works on all points that are contained in the trp_file. This file serves to select the points for which power function generation will be performed.


This program produces powerfunction output. For every data point that occurs in the input trp_file a powerfunction is created. For every point, a loop runs over all the triplet files that are defined in the definition_file. If the point is found in such a time data file, an output line of the format (T, F) is generated. T is the time value according to column 1 in the definition_file. F is a function value that is computed according to the operation_string.

The operation_string defines the function value in the program output. It can define a computation using the function value from the input trp_file, which is represented by the characters ‘FR’ (reference value) in the operation_string. The function value of the current time data triplet file is represented by the characters ‘FT’ (time value). The resulting value for the program output is represented by the character ‘F’.

Example 1: The resulting value is the reference value plus the time-dependent value: ‘F = FR + FT’. The reference data are used as an offset for the time-dependent data.

Example 2: The resulting value is the time-dependent value shifted and scaled: ‘F = (FT + 0.36) * 1000.0’

The default value of operation_string is ‘F = FT’, so the reference data are not used and the time-dependent data are used without any changes.

The -id start_id option defines the first power function ID. The default value is -999. The power function ID is increased automatically. The -s snap_distance option defines the radius around the reference points used to identify ‘identical’ points in the time-dependent triplet files. The default value is 1.e-6.

Example:
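A sketch; the file names and times are illustrations. A hypothetical definition_file ‘campaigns.def’:

! head measurement campaigns
0.0 heads_2001.trp
365.0 heads_2002.trp
730.0 heads_2003.trp
END

trp2pow -id 500 -o 'F = FT' -s 0.5 campaigns.def observation_wells.trp > measured_heads.pow

For every point of observation_wells.trp a powerfunction (IDs 500, 501, ...) is built from the head values found within 0.5 length units of that point in the three campaign files.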

‘trp2pow_acc’

usage: trp2pow_acc [-t start_time] [-id start_id] [-o ‘operation_string’] definition_file trp_file

trp2pow_acc is very similar to trp2pow. The difference consists of two points:

The time levels in the definition file and the function values defined in the files trp_file_at_time_xxx are interpreted as time step lengths and as value changes at these time steps. (In trp2pow these data are treated as absolute time levels and absolute physical values at that time.)

The initial time comes from the input parameter -t start_time and the initial value comes from the input trp_file.

The resulting powerfunction data are computed in the following way:

T is the accumulated time value. It is computed starting from start_time and is increased by every time value in the definition_file:

T = start_time + time1 + time2 + ... + timeQ (for the Q-th power function entry)

F is an accumulated function value. It starts with the reference value from the input trp_file. For every time step it is increased (plus operation) by the current time step value, which is computed according to the operation_string:

F = reference_value + F1 + F2 + ... + FQ (for the Q-th power function entry)

Example:
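A sketch; the file names and numbers are illustrations. With a hypothetical definition_file ‘increments.def’ holding time step lengths and triplet files of value changes:

! monthly settlement increments
30.0 settle_month1.trp
30.0 settle_month2.trp
30.0 settle_month3.trp
END

trp2pow_acc -t 0.0 -id 900 increments.def initial_elevation.trp > elevation_history.pow

Every generated powerfunction starts at time 0.0 with the elevation from initial_elevation.trp and accumulates both the 30-day time steps and the monthly changes.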

‘opera_trp’

usage: opera_trp [-o ‘operation_string’] trp_file1 trp_file2

opera_trp is useful to perform operations (e.g. difference computation) on point data exported from the FEFLOW mesh. These data are always written in the same order (the internal indexing, visible in the mesh inspector).

opera_trp works on two trp files with the same points (identical coordinates but arbitrary function values) in the same order. It takes two data lines from file 1 and file 2 and writes out the coordinates from file 1 together with the value F computed by the operation string.

The operation_string can define a computation using the function value from trp_file1 (represented as ‘F1’) or from the corresponding point in trp_file2 (represented as ‘F2’).

- Operation examples with a constant value, e.g.:

‘F = F1+10.0’ ‘F = (F1+F2)/2.0’ ‘F = 2.0*F1 + F2’

- Operation examples with functions (atan2, cos, exp, int, log, rand, sin, sqrt, ... see the manual pages for ‘nawk’), e.g.:

‘F = log (F1-F2)’ ‘F = 2.0 * sin (F1 - 3.1415)’ ‘F = sqrt(F1*F2)’

The default value of operation_string is ‘F = F1’, which is identical to leaving the value of the reference point unchanged.
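A sketch; the file names are illustrations:

opera_trp -o 'F = F1 - F2' heads_calibrated.trp heads_initial.trp > head_difference.trp

Write, for every mesh point, the difference between the two exported head distributions.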


‘grid2trp’

usage: grid2trp grid_file

grid2trp converts a grid file in the ESRI grid file format (ASCII) into FEFLOW triplet formatted output [x y f]. The ‘nodata’ points are hidden in the output. All other data points in the grid are written out with point coordinates at the center points of the grid cells. The program is easy to adapt to other grid file formats.

Example:
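A sketch; the grid contents are illustrations of the ESRI ASCII grid layout:

ncols 4
nrows 2
xllcorner 4500000.0
yllcorner 5800000.0
cellsize 50.0
NODATA_value -9999
12.1 12.4 -9999 13.0
11.8 12.0 12.2 12.6

grid2trp elevation.grd > elevation.trp

Every cell except the NODATA cell is written as one triplet line with the x/y coordinates of the cell center and the cell value.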


Data Conversion Tools

Tools to convert data between ASCII files and ESRI shape files + AutoCAD DXF files

1. Introduction

The Data Conversion Tools are programs written in C that read a file without changing it. If you don’t specify an output filename using the ‘-o’ option, the output filename is generated automatically from the input filename, or the output is directed to stdout. The data conversion tools can be found in the /bin directory. If the name of a conversion tool is typed without a specified input file, a help text is displayed explaining the usage of the tool.

2. Converting ASCII files and ESRI Shapefiles

Source                                   Tool       Destination
ESRI shape file (polygon)                shptoasc   ASCII polygon file (*.ply)
ESRI shape file (line)                   shptoasc   ASCII line file (*.lin)
ESRI shape file (point)                  shptoasc   ASCII point file (*.pnt)
ESRI shape file (point)                  shptotrp   Triplet file (*.trp)
ASCII polygon file (*.ply)               plytoshp   ESRI shape file (polygon)
ASCII line file (*.lin)                  lintoshp   ESRI shape file (line)
ASCII point file (*.pnt)                 pnttoshp   ESRI shape file (point)
Triplet file (*.trp)                     trptoshp   ESRI shape file (point)
AutoCAD exchange file (*.dxf)            dxftoasc   ASCII polygon file (*.ply),
  (NOTE: not compliant with recent                  ASCII line file (*.lin),
  *.dxf file format)                                ASCII point file (*.pnt),
                                                    ASCII text file (*.ano)
ASCII file (*.ply, *.lin, *.pnt, *.ano)  asctodxf   AutoCAD exchange file (*.dxf)
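A sketch of a typical round trip; the file names are illustrations, and the exact option syntax should be taken from the help text that appears when a tool is called without an input file:

shptoasc wells.shp -o wells.pnt

Convert a point shape file into an ASCII point file.

pnttoshp wells.pnt -o wells_copy.shp

Convert the ASCII point file back into an ESRI shape file.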
