--- manual/s_getstarted/text/getting_started.tex 2004/03/24 20:53:12 1.22
+++ manual/s_getstarted/text/getting_started.tex 2004/04/08 02:24:23 1.23
@@ -1,4 +1,4 @@
-% $Header: /home/ubuntu/mnt/e9_copy/manual/s_getstarted/text/getting_started.tex,v 1.22 2004/03/24 20:53:12 edhill Exp $
+% $Header: /home/ubuntu/mnt/e9_copy/manual/s_getstarted/text/getting_started.tex,v 1.23 2004/04/08 02:24:23 edhill Exp $
% $Name: $
%\section{Getting started}
@@ -666,7 +666,6 @@
\end{verbatim}
-
\subsection{Using \textit{genmake2}}
\label{sect:genmake}
@@ -828,6 +827,10 @@
``-standarddirs'' option)
\end{itemize}
+\item[\texttt{--mpi}] This option enables certain MPI features (using
+ CPP \texttt{\#define}s) within the code and is necessary for MPI
+ builds (see Section \ref{sect:mpi-build}).
+
\item[\texttt{--make=/path/to/gmake}] Due to the poor handling of
soft-links and other bugs common with the \texttt{make} versions
provided by commercial Unix vendors, GNU \texttt{make} (sometimes
@@ -840,7 +843,7 @@
a Bourne, POSIX, or compatible) shell. The syntax in these
circumstances is:
\begin{center}
- \texttt{/bin/sh genmake2 -bash=/bin/sh [...options...]}
+ \texttt{\% /bin/sh genmake2 -bash=/bin/sh [...options...]}
\end{center}
where \texttt{/bin/sh} can be replaced with the full path and name
of the desired shell.
@@ -848,13 +851,85 @@
\end{description}
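+
+To illustrate how the options above combine, a \texttt{genmake2}
+invocation might look like the following sketch, where the
+\texttt{gmake} path and \texttt{YOUR\_OPTFILE} are placeholders for
+values appropriate to your system:
+{\footnotesize \begin{verbatim}
+  % ../../../tools/genmake2 --make=/usr/local/bin/gmake \
+      -mpi -mods=../code -of=YOUR_OPTFILE
+\end{verbatim} }
+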
+\subsection{Building with MPI}
+\label{sect:mpi-build}
+
+Building MITgcm to use MPI libraries can be complicated due to the
+variety of different MPI implementations available, their dependencies
+or interactions with different compilers, and their often ad-hoc
+locations within file systems.  For these reasons, it's generally a
+good idea to start by finding and reading the documentation for your
+machine(s) and, if necessary, seeking help from your local systems
+administrator.
+
+The steps for building MITgcm with MPI support are:
+\begin{enumerate}
+
+\item Determine the locations of your MPI-enabled compiler and/or MPI
+ libraries and put them into an options file as described in Section
+ \ref{sect:genmake}. One can start with one of the examples in:
+ \begin{center}
+ \texttt{MITgcm/tools/build\_options/}
+ \end{center}
+ such as \texttt{linux\_ia32\_g77+mpi\_cg01} or
+ \texttt{linux\_ia64\_efc+mpi} and then edit it to suit the machine at
+  hand.  You may need help from your user guide or local systems
+  administrator to determine the exact location of the MPI libraries;
+  a sketch of typical MPI-related optfile entries is given after the
+  list below.  If no MPI library is already installed, freely
+  available implementations and related tools include:
+  \begin{itemize}
+  \item MPICH
+  \item LAM/MPI
+  \item MPIexec
+  \end{itemize}
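+
+  For MPI builds, the relevant portion of an optfile is typically a
+  small set of shell-variable assignments naming the compiler and the
+  MPI header and library locations.  The following is only a sketch:
+  the values and paths are placeholders, and the existing files in
+  \texttt{MITgcm/tools/build\_options/} should be consulted for
+  working settings:
+{\footnotesize \begin{verbatim}
+  FC='mpif77'
+  # paths and library name below are placeholders; they depend on
+  # where and how your MPI implementation is installed
+  INCLUDES='-I/path/to/mpi/include'
+  LIBS='-L/path/to/mpi/lib -lmpich'
+\end{verbatim} }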
+
+\item Build the code with the \texttt{genmake2} \texttt{-mpi} option
+ (see Section \ref{sect:genmake}) using commands such as:
+{\footnotesize \begin{verbatim}
+ % ../../../tools/genmake2 -mods=../code -mpi -of=YOUR_OPTFILE
+ % make depend
+ % make
+\end{verbatim} }
+
+\item Run the code with the appropriate MPI ``run'' or ``exec''
+ program provided with your particular implementation of MPI.
+ Typical MPI packages such as MPICH will use something like:
+\begin{verbatim}
+ % mpirun -np 4 -machinefile mf ./mitgcmuv
+\end{verbatim}
+  Slightly more complicated scripts may be needed for many machines
+  since execution of the code may be controlled by both the MPI
+  library and a job scheduling and queueing system such as PBS,
+  LoadLeveler, Condor, or any of a number of similar tools.  A
+  minimal scheduler script is sketched after this list.
+
+\end{enumerate}
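+
+As a purely illustrative sketch, a submission script for a PBS-style
+batch system might resemble the following, where the job name, node
+counts, and process count are placeholders that must match your
+site's queueing setup:
+{\footnotesize \begin{verbatim}
+  #!/bin/sh
+  #PBS -N mitgcmuv
+  #PBS -l nodes=2:ppn=2
+  # PBS places the list of allocated hosts in $PBS_NODEFILE and
+  # starts the job in the home directory, hence the cd below
+  cd $PBS_O_WORKDIR
+  mpirun -np 4 -machinefile $PBS_NODEFILE ./mitgcmuv
+\end{verbatim} }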
+
+
\section{Running the model}
\label{sect:runModel}
-If compilation finished succesfuully (section \ref{sect:buildModel})
-then an executable called {\em mitgcmuv} will now exist in the local
-directory.
+If compilation finished successfully (section \ref{sect:buildingCode})
+then an executable called \texttt{mitgcmuv} will now exist in the
+local directory.
To run the model as a single process (ie. not in parallel) simply
type: