$Header: /u/gcmpack/models/MITgcmUV/doc/README,v 1.12 1998/07/16 16:58:32 cnh Exp $


MITgcmUV Getting Started
========================

o Introduction

This note is a guide to using the MIT General Circulation Model Ultra Versatile
implementation, MITgcmUV. MITgcmUV is a Fortran code that implements the
algorithm described in Marshall et al. 1997, Hill, Adcroft, ...
The MITgcmUV implementation is designed to work efficiently on all classes
of computer platform. It can be used in both a single-processor mode
and a parallel-processor mode. Parallel processing can be either multi-threaded
shared memory, such as that found on CRAY T90 machines, or multi-process
distributed memory. A set of "execution environment" support routines is
used to allow the same numerical code to run on top of a single-process,
multi-threaded or distributed multi-process configuration.

o Installing
  To set up the model on a particular computer the code tree must be created
  and appropriate compile and run scripts set up. For some platforms
  the necessary scripts are included in the release - in this case follow
  the steps below:

  1. Extract MITgcmUV from the downloadable archive
       tar -xvf checkpoint12.tar

  2. Create a platform-specific makefile
     For example, on a Digital UNIX machine the script "genmake.dec"
     can be used as shown below
       cd bin
       ../tools/genmake
       cp Makefile.alpha Makefile    ( on an Alpha machine )

  3. Create the header-file dependency entries
       make depend

  4. Compile the code
       make

  5. Copy the input files
       cp ../verification/exp2/[a-z]* ../verification/exp2/*bin .

  6. Run the baseline test case ( a complete session is sketched
     below )
       setenv PARALLEL 1
       dmpirun -np 2 ../exe/mitgcmuv           ( under Digital UNIX )
       mpirun.p4shmem ../exe/mitgcmuv -np 2    ( under Solaris + mpich )
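
  Putting these steps together, a complete session on a Digital UNIX
  (Alpha) machine might look like the sketch below. This is only an
  illustration: it assumes the archive name, scripts and paths used
  in the steps above.

       tar -xvf checkpoint12.tar
       cd bin
       ../tools/genmake
       cp Makefile.alpha Makefile
       make depend
       make
       cp ../verification/exp2/[a-z]* ../verification/exp2/*bin .
       setenv PARALLEL 1
       dmpirun -np 2 ../exe/mitgcmuv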


     This runs a 4 degree global ocean climatological simulation.
     By default the code is set to use two processors, splitting
     the model domain along the equator. Textual output is written
     to files STDOUT.* and STDERR.*, with one file for each process.
     Model fields are written to files suffixed .data and .meta.
     These files are written on a per-process basis. The .meta
     file indicates the location and shape of the subdomain in
     each .data file.
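
     For illustration, a .meta file is a small text file describing
     the contents of its companion .data file. A hypothetical .meta
     file for one process's subdomain might look like the lines below;
     the exact field names and values vary between releases, so treat
     this only as a sketch.

       nDims = [   3 ];
       dimList = [
          90,    1,   90,
          40,    1,   20,
          15,    1,   15
       ];
       format = [ 'float32' ];
       nrecords = [   1 ];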

     This 4 degree global ocean climatological simulation is the
     baseline configuration for the MITgcmUV code. The change files
     in the verification directory that convert the model to other
     configurations all assume that the model is configured for the
     baseline case, and alter the model code accordingly.

o Running

  - Input and output files

    Required files
    ==============
    The model is configured to look for two files with fixed names,
    called "eedata" and "data".
    The file eedata contains "execution environment" data. At present
    this consists of a specification of the number of threads to
    use in X and Y under multithreaded execution ( see the sketch
    below ).
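
    For example, an eedata file requesting two threads in each
    direction might contain a namelist along the following lines. The
    group name EEPARMS is assumed here; check the eedata file shipped
    with the release for the exact form.

       &EEPARMS
        nTx=2,
        nTy=2,
       &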
79 |
|
80 |
- Serial execution |
81 |
|
82 |
- Parallel execution. Threads |
83 |
nSx, nSy |
84 |
setenv PARALLEL n |
85 |
nTx=2, nTy=2 |
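
    A minimal multi-threaded session, assuming four threads with
    nTx=2 and nTy=2 set in eedata as sketched above, might be:

       setenv PARALLEL 4
       ../exe/mitgcmuv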

  - Parallel execution. MPI
      nPx, nPy            ( processes in X and Y )
      dmpirun
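
    Under MPI the domain is divided among nPx*nPy processes, so the
    process count given to the MPI launch command should match that
    product. For example, with nPx=2 and nPy=1 under Digital UNIX:

       dmpirun -np 2 ../exe/mitgcmuv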

  - Parallel execution. Hybrid

o Customising the code

  Model structure
  ===============
  The "numerical" model is contained within an execution
  environment support wrapper. This wrapper is designed
  to provide a general framework for grid-point models;
  MITgcmUV is a specific numerical model that uses this
  framework.
  Under this structure the model is split into
  execution environment support code and conventional
  numerical model code. The execution environment
  support code is held under the "eesupp" directory.
  The grid-point model code is held under the
  "model" directory.
  Code execution actually starts in the eesupp
  routines, not in the model routines. For this
  reason the top-level main.F is in the eesupp/src
  directory. End-users should not need to worry about
  this level. The top-level routine for the numerical
  part of the code is in model/src/the_model_main.F.
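
  Schematically ( only the routines mentioned above are shown ):

     eesupp/
       src/
         main.F               <- execution starts here
     model/
       src/
         the_model_main.F     <- top of the numerical code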


o References
  Web sites for documentation:
    HP
    Digital
    SGI
    Sun
    Linux threads
    CRAY multitasking
    PPT notes