--- manual/s_phys_pkgs/text/exch2.tex 2004/03/19 21:25:45 1.17 +++ manual/s_phys_pkgs/text/exch2.tex 2010/08/30 23:09:21 1.29 @@ -1,4 +1,4 @@ -% $Header: /home/ubuntu/mnt/e9_copy/manual/s_phys_pkgs/text/exch2.tex,v 1.17 2004/03/19 21:25:45 afe Exp $ +% $Header: /home/ubuntu/mnt/e9_copy/manual/s_phys_pkgs/text/exch2.tex,v 1.29 2010/08/30 23:09:21 jmc Exp $ % $Name: $ %% * Introduction @@ -10,11 +10,11 @@ %% o automatically inserted at \section{Reference} -\section{exch2: Extended Cubed Sphere \mbox{Topology}} +\subsection{exch2: Extended Cubed Sphere \mbox{Topology}} \label{sec:exch2} -\subsection{Introduction} +\subsubsection{Introduction} The \texttt{exch2} package extends the original cubed sphere topology configuration to allow more flexible domain decomposition and @@ -23,7 +23,7 @@ dimensions of the subdomain. Furthermore, the tiles can run on separate processors individually or in groups, which provides for manual compile-time load balancing across a relatively arbitrary -number of processors. \\ +number of processors. The exchange parameters are declared in \filelink{pkg/exch2/W2\_EXCH2\_TOPOLOGY.h}{pkg-exch2-W2_EXCH2_TOPOLOGY.h} @@ -32,7 +32,7 @@ validity of the cube topology depends on the \file{SIZE.h} file as detailed below. The default files provided in the release configure a cubed sphere topology of six tiles, one per subdomain, each with -32$\times$32 grid points, all running on a single processor. Both +32$\times$32 grid points, with all tiles running on a single processor. Both files are generated by Matlab scripts in \file{utils/exch2/matlab-topology-generator}; see Section \ref{sec:topogen} \sectiontitle{Generating Topology Files for exch2} @@ -41,46 +41,49 @@ \file{utils/exch2/code-mods} along with the appropriate \file{SIZE.h} file for single-processor execution. -\subsection{Invoking exch2} +\subsubsection{Invoking exch2} To use exch2 with the cubed sphere, the following conditions must be -met: \\ +met: -$\bullet$ The exch2 package is included when \file{genmake2} is run. - The easiest way to do this is to add the line \code{exch2} to the - \file{profile.conf} file -- see Section - \ref{sect:buildingCode} \sectiontitle{Building the code} for general - details. \\ +\begin{itemize} +\item The exch2 package is included when \file{genmake2} is run. The + easiest way to do this is to add the line \code{exch2} to the + \file{packages.conf} file -- see Section \ref{sec:buildingCode} + \sectiontitle{Building the code} for general + details. -$\bullet$ An example of \file{W2\_EXCH2\_TOPOLOGY.h} and +\item An example of \file{W2\_EXCH2\_TOPOLOGY.h} and \file{w2\_e2setup.F} must reside in a directory containing files - symbolically linked when \file{genmake2} runs. The safest place to - put these is the directory indicated in the \code{-mods=DIR} command - line modifier (typically \file{../code}), or the build directory. - The default versions of these files reside in \file{pkg/exch2} and - are linked automatically if no other versions exist elsewhere in the - build path, but they should be left untouched to avoid breaking - configurations other than the one you intend to modify.\\ - -$\bullet$ Files containing grid parameters, named - \file{tile00$n$.mitgrid} where $n$=\code{(1:6)} (one per subdomain), - must be in the working directory when the MITgcm executable is run. 
- These files are provided in the example experiments for cubed sphere - configurations with 32$\times$32 cube sides and are non-trivial to - generate -- please contact MITgcm support if you want to generate - files for other configurations. \\ - -$\bullet$ As always when compiling MITgcm, the file \file{SIZE.h} must - be placed where \file{genmake2} will find it. In particular for - exch2, the domain decomposition specified in \file{SIZE.h} must - correspond with the particular configuration's topology specified in + symbolically linked by the \file{genmake2} script. The safest place + to put these is the directory indicated in the \code{-mods=DIR} + command line modifier (typically \file{../code}), or the build + directory. The default versions of these files reside in + \file{pkg/exch2} and are linked automatically if no other versions + exist elsewhere in the build path, but they should be left untouched + to avoid breaking configurations other than the one you intend to + modify. + +\item Files containing grid parameters, named \file{tile00$n$.mitgrid} + where $n$=\code{(1:6)} (one per subdomain), must be in the working + directory when the MITgcm executable is run. These files are + provided in the example experiments for cubed sphere configurations + with 32$\times$32 cube sides -- please contact MITgcm support if you + want to generate files for other configurations. + +\item As always when compiling MITgcm, the file \file{SIZE.h} must be + placed where \file{genmake2} will find it. In particular for exch2, + the domain decomposition specified in \file{SIZE.h} must correspond + with the particular configuration's topology specified in \file{W2\_EXCH2\_TOPOLOGY.h} and \file{w2\_e2setup.F}. Domain decomposition issues particular to exch2 are addressed in Section \ref{sec:topogen} \sectiontitle{Generating Topology Files for exch2} - and \ref{sec:exch2mpi} \sectiontitle{exch2, SIZE.h, and MPI}; a more - general background on the subject relevant to MITgcm is presented in - Section \ref{sect:specifying_a_decomposition} - \sectiontitle{Specifying a decomposition}.\\ + and \ref{sec:exch2mpi} \sectiontitle{exch2, SIZE.h, and + Multiprocessing}; a more general background on the subject + relevant to MITgcm is presented in Section + \ref{sec:specifying_a_decomposition} + \sectiontitle{Specifying a decomposition}. +\end{itemize} At the time of this writing the following examples use exch2 and may be used for guidance: @@ -96,7 +99,7 @@ -\subsection{Generating Topology Files for exch2} +\subsubsection{Generating Topology Files for exch2} \label{sec:topogen} Alternate cubed sphere topologies may be created using the Matlab @@ -106,59 +109,71 @@ from the Matlab prompt (there are no parameters to pass) generates exch2 topology files \file{W2\_EXCH2\_TOPOLOGY.h} and \file{w2\_e2setup.F} in the working directory and displays a figure of -the topology via Matlab. The other m-files in the directory are -subroutines of \file{driver.m} and should not be run ``bare'' except +the topology via Matlab -- figures \ref{fig:6tile}, \ref{fig:18tile}, +and \ref{fig:48tile} are examples of the generated diagrams. The other +m-files in the directory are +subroutines called from \file{driver.m} and should not be run ``bare'' except for development purposes. \\ The parameters that determine the dimensions and topology of the generated configuration are \code{nr}, \code{nb}, \code{ng}, \code{tnx} and \code{tny}, and all are assigned early in the script. 
\\
-The first three determine the size of the subdomains and
+The first three determine the height and width of the subdomains and hence the size of the overall domain. Each one determines the number of grid points, and therefore the resolution, along the subdomain
-sides in a ``great circle'' around an axis of the cube. At the time
+sides in a ``great circle'' around each of the three spatial axes of the cube. At the time of this writing MITgcm requires these three parameters to be equal, but they provide for future releases to accommodate
-resolutions around the axes to allow (for example) greater resolution
-around the equator.\\
+subdomains with differing resolutions around the three axes.\\
-The parameters \code{tnx} and \code{tny} determine the dimensions of
+The parameters \code{tnx} and \code{tny} determine the width and height of the tiles into which the subdomains are decomposed, and must evenly divide the integer assigned to \code{nr}, \code{nb} and \code{ng}. The result is a rectangular tiling of the subdomain. Figure
-\ref{fig:24tile} shows one possible topology for a twenty-four-tile
-cube, and figure \ref{fig:12tile} shows one for twelve tiles. \\
+\ref{fig:48tile} shows one possible topology for a forty-eight-tile
+cube, and figure \ref{fig:6tile} shows one for six tiles. \\
\begin{figure} \begin{center}
- \resizebox{4in}{!}{
- \includegraphics{part6/s24t_16x16.ps}
+ \resizebox{6in}{!}{
+% \includegraphics{s_phys_pkgs/figs/s24t_16x16.ps}
+ \includegraphics{s_phys_pkgs/figs/adjust_cs.ps}
 } \end{center} \caption{Plot of a cubed sphere topology with a 32$\times$192 domain divided into six 32$\times$32 subdomains, each of which is divided
-into four tiles (\code{tnx=16, tny=16}) for a total of twenty-four
-tiles. } \label{fig:24tile}
+into eight tiles of width \code{tnx=16} and height \code{tny=8} for a
+total of forty-eight tiles. The colored borders of the subdomains
+represent the parameters \code{nr} (red), \code{ng} (green), and
+\code{nb} (blue).
+This tiling is used in the example
+\file{verification/adjustment.cs-32x32x1/},
+with the option (\file{blanklist.txt}) to remove the 4 land-only tiles
+(11,12,13,14), which are filled in red on the plot.
+} \label{fig:48tile} \end{figure} \begin{figure} \begin{center}
- \resizebox{4in}{!}{
- \includegraphics{part6/s12t_16x32.ps}
+ \resizebox{6in}{!}{
+% \includegraphics{s_phys_pkgs/figs/s12t_16x32.ps}
+ \includegraphics{s_phys_pkgs/figs/polarcap.ps}
 } \end{center}
-\caption{Plot of a cubed sphere topology with a 32$\times$192 domain
-divided into six 32$\times$32 subdomains of two tiles each
- (\code{tnx=16, tny=32}).
-} \label{fig:12tile}
+\caption{Plot of a non-square cubed sphere topology with
+6 subdomains of different sizes (\code{nr=90, ng=360, nb=90}),
+divided into one to four tiles each
+ (\code{tnx=90, tny=90}), resulting in a total of 18 tiles.
+} \label{fig:18tile} \end{figure} \begin{figure} \begin{center} \resizebox{4in}{!}{
- \includegraphics{part6/s6t_32x32.ps}
+% \includegraphics{s_phys_pkgs/figs/s6t_32x32.ps}
+ \includegraphics{s_phys_pkgs/figs/s6t_32x32.ps}
 } \end{center} \caption{Plot of a cubed sphere topology with a 32$\times$192 domain
@@ -179,54 +194,63 @@
-\subsection{exch2, SIZE.h, and multiprocessing}
+\subsubsection{exch2, SIZE.h, and Multiprocessing} \label{sec:exch2mpi} Once the topology configuration files are created, the Fortran \code{PARAMETER}s in \file{SIZE.h} must be configured to match.
-Section \ref{sect:specifying_a_decomposition} \sectiontitle{Specifying
-a decomposition} provides a general description of domain
+Section \ref{sec:specifying_a_decomposition} \sectiontitle{Specifying
+ a decomposition} provides a general description of domain decomposition within MITgcm and its relation to \file{SIZE.h}. The
-current section specifies certain constraints the exch2 package
-imposes as well as describes how to enable parallel execution with
-MPI. \\
+current section specifies constraints that the exch2 package imposes
+and describes how to enable parallel execution with MPI.
 As in the general case, the parameters \varlink{sNx}{sNx} and \varlink{sNy}{sNy} define the size of the individual tiles, and so must be assigned the same respective values as \code{tnx} and
-\code{tny} in \file{driver.m}.\\
+\code{tny} in \file{driver.m}.
 The halo width parameters \varlink{OLx}{OLx} and \varlink{OLy}{OLy} have no special bearing on exch2 and may be assigned as in the general
-case. The same holds for \varlink{Nr}{Nr}, the number of vertical
-levels in the model.\\
+case. The same holds for \varlink{Nr}{Nr}, the number of vertical
+levels in the model.
 The parameters \varlink{nSx}{nSx}, \varlink{nSy}{nSy}, \varlink{nPx}{nPx}, and \varlink{nPy}{nPy} relate to the number of tiles and how they are distributed on processors. When using exch2,
-the tiles are stored in a single dimension, and so
+the tiles are stored in the $x$ dimension, and so \code{\varlink{nSy}{nSy}=1} in all cases. Since the tiles as configured by exch2 cannot be split up across processors without
-regenerating the topology, \code{\varlink{nPy}{nPy}=1} as well. \\
+regenerating the topology, \code{\varlink{nPy}{nPy}=1} as well.
 The number of tiles MITgcm allocates and how they are distributed between processors depends on \varlink{nPx}{nPx} and \varlink{nSx}{nSx}. \varlink{nSx}{nSx} is the number of tiles per
-processor and \varlink{nPx}{nPx} the number of processors. The total
-number of tiles in the topology minus those listed in
-\file{blanklist.txt} must equal \code{nSx*nPx}. \\
-
-The following is an example of \file{SIZE.h} for the twelve-tile
-configuration illustrated in figure \ref{fig:12tile} running on
-one processor: \\
+processor and \varlink{nPx}{nPx} is the number of processors. The
+total number of tiles in the topology minus those listed in
+\file{blanklist.txt} must equal \code{nSx*nPx}. Note that in order to
+obtain maximum usage from a given number of processors, in some cases
+this restriction might entail sharing a processor with a tile that
+could otherwise be excluded (being topographically outside of the
+domain) by listing it in \file{blanklist.txt}. For example,
+suppose you have five processors and a domain decomposition of
+thirty-six tiles that allows you to exclude seven tiles. To evenly
+distribute the remaining twenty-nine tiles among five processors, you
+would have to run one ``dummy'' tile to make an even six tiles per
+processor. Such dummy tiles are \emph{not} listed in
+\file{blanklist.txt}.
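For concreteness, the \file{SIZE.h} settings for this hypothetical
thirty-six-tile, five-processor case might look as sketched below.
Only \code{nSx=6} and \code{nPx=5} follow from the example above (with
\code{nSy=nPy=1} as always under exch2); the subdomain and tile
dimensions are purely illustrative (here each 48$\times$48 subdomain
is assumed to be split into 24$\times$16 tiles, giving six tiles per
subdomain and thirty-six in total, of which only six are listed in
\file{blanklist.txt} so that \code{nSx*nPx=30} tiles are allocated),
and the remaining \file{SIZE.h} entries are unchanged in form from the
two examples that follow.
\begin{verbatim}
C  Sketch only: nSx and nPx follow from the dummy-tile example above;
C  the tile and subdomain sizes are illustrative, not from a
C  distributed configuration.
 PARAMETER (
  & sNx = 24,
  & sNy = 16,
  & OLx = 2,
  & OLy = 2,
  & nSx = 6,
  & nSy = 1,
  & nPx = 5,
  & nPy = 1,
  & Nr = 5)
\end{verbatim}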
+
+The following is an example of \file{SIZE.h} for the six-tile
+configuration illustrated in figure \ref{fig:6tile}
+running on one processor:
 \begin{verbatim} PARAMETER (
- & sNx = 16,
+ & sNx = 32,
 & sNy = 32, & OLx = 2, & OLy = 2,
- & nSx = 12,
+ & nSx = 6,
 & nSy = 1, & nPx = 1, & nPy = 1,
@@ -235,16 +259,16 @@
 & Nr = 5) \end{verbatim}
-The following is an example for the twenty-four-tile topology in
-figure \ref{fig:24tile} running on six processors:
+The following is an example for the forty-eight-tile topology in
+figure \ref{fig:48tile} running on six processors:
 \begin{verbatim} PARAMETER ( & sNx = 16,
- & sNy = 16,
+ & sNy = 8,
 & OLx = 2, & OLy = 2,
- & nSx = 4,
+ & nSx = 8,
 & nSy = 1, & nPx = 6, & nPy = 1,
@@ -254,10 +278,7 @@
 \end{verbatim}
-
-
-
-\subsection{Key Variables}
+\subsubsection{Key Variables}
 The descriptions of the variables are divided up into scalars, one-dimensional arrays indexed to the tile number, and two and
@@ -267,7 +288,7 @@
 arrays to individual tiles, and the arrays indexed by tile and neighbor to relationships between tiles and their neighbors. \\
-\subsubsection{Scalars}
+Scalars:
 The number of tiles in a particular topology is set with the parameter \code{NTILES}, and the maximum number of neighbors of any tile by
@@ -281,16 +302,16 @@
 of tiles in the $x$ and $y$ global indices. For example, the default setup of six tiles (Fig. \ref{fig:6tile}) has \code{exch2\_domain\_nxt=6} and \code{exch2\_domain\_nyt=1}. A
-topology of twenty-four square tiles, four per subdomain (as in figure
-\ref{fig:24tile}), will have \code{exch2\_domain\_nxt=12} and
-\code{exch2\_domain\_nyt=2}. Note that these parameters express the
-tile layout to allow global data files that are tile-layout-neutral
-and have no bearing on the internal storage of the arrays. The tiles
-are stored internally in a range from \code{(1:\varlink{bi}{bi})} the
-$x$ axis, and the $y$ axis variable \varlink{bj}{bj} generally is
-ignored within the package. \\
+topology of forty-eight tiles, eight per subdomain (as in figure
+\ref{fig:48tile}), will have \code{exch2\_domain\_nxt=12} and
+\code{exch2\_domain\_nyt=4}. Note that these parameters express the
+tile layout in order to allow global data files that are tile-layout-neutral.
+They have no bearing on the internal storage of the arrays. The tiles
+are stored internally in the range \code{\varlink{bi}{bi}=(1:NTILES)} along the
+$x$ axis, and the $y$ axis variable \varlink{bj}{bj} is assumed to
+equal \code{1} throughout the package. \\
-\subsubsection{Arrays Indexed to Tile Number}
+Arrays indexed to tile number:
 The following arrays are of length \code{NTILES} and are indexed to the tile number, which is indicated in the diagrams with the notation
@@ -300,33 +321,33 @@
 \varlink{exch2\_tny}{exch2_tny} express the $x$ and $y$ dimensions of each tile. At present for each tile \texttt{exch2\_tnx=sNx} and \texttt{exch2\_tny=sNy}, as assigned in \file{SIZE.h} and described in
-section \ref{sec:exch2mpi} \sectiontitle{exch2, SIZE.h, and
-multiprocessing}. Future releases of MITgcm are to allow varying tile
+Section \ref{sec:exch2mpi} \sectiontitle{exch2, SIZE.h, and
+Multiprocessing}. Future releases of MITgcm may allow varying tile
 sizes. \\
-The location of the tiles' Cartesian origin within a subdomain are
-determined by the arrays \varlink{exch2\_tbasex}{exch2_tbasex} and
-\varlink{exch2\_tbasey}{exch2_tbasey}. These variables are used to
-relate the location of the edges of different tiles to each other.
As
+The arrays \varlink{exch2\_tbasex}{exch2_tbasex} and
+\varlink{exch2\_tbasey}{exch2_tbasey} determine the tiles'
+Cartesian origin within a subdomain
+and locate the edges of different tiles relative to each other. As
+an example, in the default six-tile topology (Fig. \ref{fig:6tile})
 each index in these arrays is set to \code{0} since a tile occupies its entire subdomain. The forty-eight-tile case discussed above will
-have values of \code{0} or \code{16}, depending on the quadrant the
-tile falls within the subdomain. The elements of the arrays
+have values of \code{0} or \code{16} for \code{exch2\_tbasex} and
+\code{0}, \code{8}, \code{16}, or \code{24} for \code{exch2\_tbasey},
+depending on the position of the tile within its subdomain. The elements of the arrays
 \varlink{exch2\_txglobalo}{exch2_txglobalo} and \varlink{exch2\_tyglobalo}{exch2_tyglobalo} are similar to \varlink{exch2\_tbasex}{exch2_tbasex} and
-\varlink{exch2\_tbasey}{exch2_tbasey}, but locate the tiles within the
+\varlink{exch2\_tbasey}{exch2_tbasey}, but locate the tile edges within the
 global address space, similar to that used by global output and input files. \\
 The array \varlink{exch2\_myFace}{exch2_myFace} contains the number of the subdomain of each tile, in a range \code{(1:6)} in the case of the standard cube topology and indicated by \textbf{\textsf{f}}$n$ in
-figures \ref{fig:12tile} and
-\ref{fig:24tile}. \varlink{exch2\_nNeighbours}{exch2_nNeighbours}
-contains a count of the neighboring tiles each tile has, and is used
-for setting bounds for looping over neighboring tiles.
+figures \ref{fig:6tile} and
+\ref{fig:48tile}. \varlink{exch2\_nNeighbours}{exch2_nNeighbours}
+contains a count of the neighboring tiles each tile has, and sets
+the bounds for looping over neighboring tiles.
 \varlink{exch2\_tProc}{exch2_tProc} holds the process rank of each tile, and is used in interprocess communication. \\
@@ -335,7 +356,7 @@
 \varlink{exch2\_isEedge}{exch2_isEedge}, \varlink{exch2\_isSedge}{exch2_isSedge}, and \varlink{exch2\_isNedge}{exch2_isNedge} are set to \code{1} if the
-indexed tile lies on the respective edge of a subdomain, \code{0} if
+indexed tile lies on the corresponding (western, eastern, southern, or
+northern) edge of its subdomain, \code{0} if
 not. The values are used within the topology generator to determine the orientation of neighboring tiles, and to indicate whether a tile lies on the corner of a subdomain. The latter case requires special
@@ -343,7 +364,7 @@
 corners of the cube. \\
-\subsubsection{Arrays Indexed to Tile Number and Neighbor}
+Arrays indexed to tile number and neighbor:
 The following arrays have vectors of length \code{MAX\_NEIGHBOURS} and \code{NTILES} and describe the orientations between the tiles. \\
@@ -368,9 +389,10 @@
 The arrays \varlink{exch2\_pi}{exch2_pi} and \varlink{exch2\_pj}{exch2_pj} specify the transformations of indices in exchanges between the neighboring tiles. These transformations are
-necessary in exchanges between subdomains because the array index in
-one dimension may map to the other index in an adjacent subdomain, and
-may be have its indexing reversed. This swapping arises from the
+necessary in exchanges between subdomains because a horizontal dimension
+in one subdomain
+may map to the other horizontal dimension in an adjacent subdomain, and
+may also have its indexing reversed. This swapping arises from the
 ``folding'' of two-dimensional arrays into a three-dimensional cube. \\
@@ -378,8 +400,8 @@
 are the neighbor ID \code{N} and the tile number \code{T} as explained above, plus a vector of length \code{2} containing transformation factors \code{t}.
The first element of the transformation vector -holds the factor to multiply the index in the same axis, and the -second element holds the the same for the orthogonal index. To +holds the factor to multiply the index in the same dimension, and the +second element holds the the same for the orthogonal dimension. To clarify, \code{exch2\_pi(1,N,T)} holds the mapping of the $x$ axis index of tile \code{T} to the $x$ axis of tile \code{T}'s neighbor \code{N}, and \code{exch2\_pi(2,N,T)} holds the mapping of \code{T}'s @@ -394,7 +416,7 @@ \code{(1,0)}, since all tiles on the same subdomain are oriented identically. An axis that corresponds to the orthogonal dimension with the same index direction in a particular tile-neighbor -orientation will have \code{(0,1)}. Those in the opposite index +orientation will have \code{(0,1)}. Those with the opposite index direction will have \code{(0,-1)} in order to reverse the ordering. \\ The arrays \varlink{exch2\_oi}{exch2_oi}, @@ -445,8 +467,8 @@ \varlink{exch2\_jthi\_c}{exch2_jthi_c} hold the location and index bounds of the edge segment of the neighbor tile \code{N}'s subdomain that gets exchanged with the local tile \code{T}. To take the example -of tile \code{T=2} in the twelve-tile topology -(Fig. \ref{fig:12tile}): \\ +of tile \code{T=2} in the forty-eight-tile topology +(Fig. \ref{fig:48tile}): \\ \begin{verbatim} exch2_itlo_c(4,2)=17 @@ -480,19 +502,20 @@ \code{Tn}'s $y$ axis corresponds to \code{T}'s $x$ axis, \code{T}'s northern edge exchanges with \code{Tn}'s western edge. The western edge of the tiles corresponds to the lower bound of the $x$ axis, so -\code{exch2\_itlo\_c} \code{exch2\_ithi\_c} are \code{0}. The range of +\code{exch2\_itlo\_c} and \code{exch2\_ithi\_c} are \code{0}, in the +western halo region of \code{Tn}. The range of \code{exch2\_jtlo\_c} and \code{exch2\_jthi\_c} correspond to the -width of \code{T}'s northern edge, plus the halo. \\ +width of \code{T}'s northern edge, expanded by one into the halo. \\ -\subsection{Key Routines} +\subsubsection{Key Routines} Most of the subroutines particular to exch2 handle the exchanges themselves and are of the same format as those described in -\ref{sect:cube_sphere_communication} \sectiontitle{Cube sphere +\ref{sec:cube_sphere_communication} \sectiontitle{Cube sphere communication}. Like the original routines, they are written as -templates which the local Makefile converts from RX into RL and RS -forms. \\ +templates which the local Makefile converts from \code{RX} into +\code{RL} and \code{RS} forms. \\ The interfaces with the core model subroutines are \code{EXCH\_UV\_XY\_RX}, \code{EXCH\_UV\_XYZ\_RX} and @@ -507,11 +530,18 @@ the singularities at the cube corners. \\ The separate scalar and vector forms of \code{EXCH2\_RX1\_CUBE} and -\code{EXCH2\_RX2\_CUBE} reflect that the vector-handling subrouine -needs to pass both the $u$ and $v$ components of the phsical vectors. -This arises from the topological folding discussed above, where the -$x$ and $y$ axes get swapped in some cases. This swapping is not an -issue with the scalar version. These subroutines call +\code{EXCH2\_RX2\_CUBE} reflect that the vector-handling subroutine +needs to pass both the $u$ and $v$ components of the physical vectors. +This swapping arises from the topological folding discussed above, where the +$x$ and $y$ axes get swapped in some cases, and is not an +issue with the scalar case. 
These subroutines call \code{EXCH2\_SEND\_RX1} and \code{EXCH2\_SEND\_RX2}, which do most of the work using the variables discussed above. \\ +\subsubsection{Experiments and tutorials that use exch2} +\label{sec:pkg:exch2:experiments} + +\begin{itemize} +\item{Held Suarez tutorial, in tutorial\_held\_suarez\_cs verification directory, +described in section \ref{sec:eg-hs} } +\end{itemize}