--- MITgcm/doc/README 1998/04/24 03:55:30 1.3
+++ MITgcm/doc/README 1998/10/28 03:23:17 1.14
@@ -1,9 +1,123 @@
-$Header: /home/ubuntu/mnt/e9_copy/MITgcm/doc/README,v 1.3 1998/04/24 03:55:30 cnh Exp $
+$Header: /home/ubuntu/mnt/e9_copy/MITgcm/doc/README,v 1.14 1998/10/28 03:23:17 cnh Exp $

MITgcmUV Getting Started
========================

o Introduction

  This note is a guide to using the MIT General Circulation Model Ultra Versatile
  implementation, MITgcmUV. MITgcmUV is a Fortran code that implements the
  algorithm described in Marshall et al. 1997, Hill, Adcroft, ...
  The MITgcmUV implementation is designed to work efficiently on all classes
  of computer platforms. It can be used in both a single-processor mode
  and a parallel-processor mode. Parallel processing can be either multi-threaded
  shared memory, such as that found on CRAY T90 machines, or multi-process
  distributed memory. A set of "execution environment" support routines is
  used to allow the same numerical code to run on top of a single-process,
  multi-threaded or distributed multi-process configuration.

o Installing

  To set up the model on a particular computer the code tree must be created
  and appropriate compile and run scripts set up. For some platforms
  the necessary scripts are included in the release - in this case follow
  the steps below:

  1. Extract MITgcmUV from the downloadable archive
       tar -xvf cNN.tar

  2. Create a platform-specific make file
     For example, on a Digital UNIX machine the "genmake" script can
     be used as shown below
       cd bin
       ../tools/genmake
       cp Makefile.alpha Makefile      ( on an Alpha machine )

  3. Now create the header file dependency entries
       make depend

  4. Compile the code
       make

  5. Copy the input files
       cp ../verification/exp2/[a-z]* ../verification/exp2/*bin .

  6. Run the baseline test case
       setenv PARALLEL 1
       dmpirun -np 2 ../exe/mitgcmuv            ( under Digital UNIX )
       mpirun.p4shmem ../exe/mitgcmuv -np 2     ( under Solaris + mpich )

     This runs a 4 degree global ocean climatological simulation.
     By default the code is set to use two processors, splitting
     the model domain along the equator. Textual output is written
     to files STDOUT.* and STDERR.*, with one file for each process.
     Model fields are written to files suffixed .data and .meta.
     These files are written on a per-process basis. The .meta
     file indicates the location and shape of the subdomain held in
     the corresponding .data file.

     This 4 degree global ocean climatological simulation is the baseline
     configuration for the MITgcmUV code. The change files in the
     verification directory that convert the model to a different
     configuration all assume that the model is configured for the
     baseline case, and change the model code accordingly.

o Running

  - Input and output files

    Required files
    ==============
    The model is configured to look for two files with fixed names.
    These files are called "eedata" and "data".
    The file eedata contains "execution environment" data. At present
    this consists of a specification of the number of threads to
    use in X and Y under multithreaded execution ( see the example
    eedata file following the notes below ).

  - Serial execution

  - Parallel execution. Threads
    Controlled by the decomposition parameters nSx, nSy and the
    thread counts nTx, nTy ( e.g. nTx=2, nTy=2 in eedata ); set
      setenv PARALLEL n
    before launching.

  - Parallel execution. MPI
    Controlled by the process counts nPx, nPy; launch with dmpirun.

  - Parallel execution. Hybrid
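  For illustration, the sketch below shows what a minimal eedata
  file for a 2x2-thread run might look like. The thread counts
  nTx and nTy are the parameters referred to above; the namelist
  group name EEPARMS, the "#" comment convention and the exact
  layout are assumptions in this sketch, so compare it with the
  eedata file supplied with the verification experiments before
  relying on it.

    # Sketch of an "eedata" file ( hypothetical layout - check
    # against the copy shipped with the release )
     &EEPARMS
     nTx=2,
     nTy=2,
     &

  With nTx=2 and nTy=2 each process would run 2x2=4 threads in a
  multithreaded configuration, used together with the
  "setenv PARALLEL n" setting shown in the notes above.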
o Customising the code

  Model structure
  ===============
  The "numerical" model is contained within an execution
  environment support wrapper. This wrapper is designed
  to provide a general framework for grid-point models;
  MITgcmUV is a specific numerical model that uses this
  framework. Under this structure the model is split into
  execution environment support code and conventional
  numerical model code. The execution environment
  support code is held under the "eesupp" directory.
  The grid-point model code is held under the
  "model" directory.
  Code execution actually starts in the eesupp
  routines, not in the model routines. For this
  reason the top-level main.F is in the eesupp/src
  directory. End-users should not need to worry about
  this level. The top-level routine for the numerical
  part of the code is in model/src/the_model_main.F.
  ( A schematic sketch of this structure is given at
  the end of this note. )

o References

  Web sites for doc
   - HP
   - Digital
   - SGI
   - Sun
   - Linux threads
   - CRAY multitasking
   - PPT notes
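o Appendix. Structure sketch

  The calling structure described under "Customising the code" can
  be pictured with the much-simplified Fortran fragment below. Only
  main.F and the_model_main.F are named in this note; the EESETUP
  and EESHUTDOWN calls are hypothetical placeholders for whatever
  the eesupp layer actually does, so treat this as a sketch of the
  layering rather than the real code.

      PROGRAM MAIN
C       Execution starts in the execution environment wrapper
C       ( eesupp/src/main.F ), which sets up single-process,
C       multi-threaded or multi-process execution ...
        CALL EESETUP
C       ... and then hands control to the top-level routine of the
C       numerical model ( model/src/the_model_main.F ).
        CALL THE_MODEL_MAIN
C       Finally the wrapper shuts the execution environment down.
        CALL EESHUTDOWN
        END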