Include documentation for building/running SRW App on Mac #240

Merged: 114 commits, May 26, 2022
370bb6c
updated docs
gspetro Feb 11, 2022
a907487
added git submodule
gspetro Feb 11, 2022
b628dd6
fix formatting
gspetro Feb 11, 2022
467071f
added new submodule commits
gspetro Feb 11, 2022
de00e4c
fixed ref links
gspetro Feb 11, 2022
fb34100
finished Intro
gspetro Feb 11, 2022
701f9e9
finish Components & Intro edits
gspetro Feb 14, 2022
cab1c6f
edited Rocoto workflow section of Quickstart
gspetro Feb 14, 2022
290364e
added minor hpc submodule commits
gspetro Feb 15, 2022
80291a1
Updates to Rocoto Workflow in Quick Start
gspetro Feb 16, 2022
ab97b74
add to HPC-stack intro
gspetro Feb 16, 2022
8056200
submodule updates
gspetro Feb 16, 2022
17504fc
added submodule docs edits
gspetro Feb 17, 2022
357e151
hpc-stack updates & formatting fixes
gspetro Feb 17, 2022
acf555b
hpc-stack intro edits
gspetro Feb 17, 2022
36349a6
bibtex attempted fix
gspetro Feb 18, 2022
838271f
add hpc-stack module edits
gspetro Feb 18, 2022
863b7de
update sphinxcontrib version
gspetro Feb 22, 2022
2b100d9
add .readthedocs.yaml file
gspetro Feb 22, 2022
9e58e67
update .readthedocs.yaml file
gspetro Feb 22, 2022
1830b49
update .readthedocs.yaml file
gspetro Feb 22, 2022
54a647e
update conf.py
gspetro Feb 22, 2022
46d381f
updates .readthedocs.yaml with submodules
gspetro Feb 22, 2022
91af03d
updates .readthedocs.yaml with submodules
gspetro Feb 22, 2022
97616fd
submodule updates
gspetro Feb 22, 2022
21d3e27
submodule updates
gspetro Feb 22, 2022
5af69e5
minor Intro edits
gspetro Feb 23, 2022
ee901e6
minor Intro edits
gspetro Feb 23, 2022
f77cba9
minor Intro edits
gspetro Feb 23, 2022
bc0748c
submodule updates
gspetro Feb 23, 2022
fef6d27
fixed typos in QS
gspetro Feb 23, 2022
0d16101
QS updates
gspetro Feb 24, 2022
418a40b
QS updates
gspetro Feb 24, 2022
77d565d
QS updates
gspetro Feb 25, 2022
2e1a03f
updates to InputOutput and QS
gspetro Feb 25, 2022
80519d4
fix I/O doc typos
gspetro Feb 25, 2022
6f11030
pull updates to hpc-stack docs
gspetro Feb 28, 2022
999a417
pull updates to hpc-stack docs
gspetro Mar 1, 2022
f07fe8a
fix table wrapping
gspetro Mar 1, 2022
14db051
Merge branch 'ufs-community:develop' into develop
gspetro-NOAA Mar 3, 2022
b58d661
updates to QS for cloud
gspetro Mar 3, 2022
301ff5f
Merge branch 'develop' of github.com:gspetro-NOAA/ufs-srweather-app i…
gspetro Mar 3, 2022
0b50e04
fix QS export statements
gspetro Mar 3, 2022
8786b32
fix QS export statements
gspetro Mar 3, 2022
3a442d6
QS edits on bind, config
gspetro Mar 3, 2022
0cae160
add bullet points to notes
gspetro Mar 4, 2022
27247a5
running without rocoto
gspetro Mar 4, 2022
805bb81
add HPC-Stack submodule w/docs
gspetro Mar 4, 2022
29cf292
split QS into container/non-container approaches
gspetro Mar 8, 2022
3e30098
added filepath changes for running in container on Orion, et al.
gspetro Mar 9, 2022
53807fa
edits to overview and container QS
gspetro Mar 10, 2022
93bfe9b
moved CodeReposAndDirs.rst info to the Introduction & deleted file
gspetro Mar 11, 2022
eb00397
continued edits to SRWAppOverview
gspetro Mar 11, 2022
f4d2043
combine overview w/non-container docs
gspetro Mar 15, 2022
d1addf8
finish merging non-container guide & SRWOverview, rename/remove files…
gspetro Mar 16, 2022
fc1a1d4
minor edits for Intro & QS
gspetro Mar 17, 2022
acb77c8
updates to BuildRun doc through 3.8.1
gspetro Mar 17, 2022
70a051b
edits to Build/Run and Components
gspetro Mar 17, 2022
99127e7
remove .gitignore
gspetro Mar 18, 2022
b01268d
fix Ch 3 title, 4 supported platform levels note
gspetro Mar 18, 2022
da35184
fix typos, add term links
gspetro Mar 18, 2022
1302868
other minor fixes/suggestions implemented
gspetro Mar 18, 2022
a704a2f
updated Intro based on feedback; changed SRW to SRW App throughout
gspetro Mar 21, 2022
7fc263d
update comment to Intro citation
gspetro Mar 21, 2022
496fcb3
Merge branch 'develop' into develop
gspetro-NOAA Mar 22, 2022
10de71f
add user-defined vertical levels to future work
gspetro Mar 22, 2022
16b0c1a
Merge branch 'develop' of github.com:gspetro-NOAA/ufs-srweather-app i…
gspetro Mar 22, 2022
e294020
Merge branch 'ufs-community:develop' into develop
gspetro-NOAA Mar 23, 2022
92bddca
Add instructions for srw_common module load
gspetro Mar 23, 2022
698613b
Pull changes from upstream
gspetro Mar 23, 2022
6fa5074
fix typo
gspetro Mar 23, 2022
a5ae76e
update Intro & BuildRunSRW based on Mark's feedback
gspetro Mar 23, 2022
ea17b19
minor intro updates
gspetro Mar 23, 2022
1aa9322
1st round of jwolff's edits
gspetro Mar 25, 2022
3d1cddb
2nd round of jwolff updates
gspetro Mar 28, 2022
173b838
update QS intro
gspetro Mar 29, 2022
09581c8
fix minor physics details
gspetro Mar 29, 2022
a714d43
update citation and physics suite name
gspetro Mar 29, 2022
4757b40
add compute node allocation info to QS
gspetro Mar 29, 2022
2c68823
add authoritative hpc-stack docs to Intro
gspetro Mar 29, 2022
a9702bb
Merge branch 'ufs-community:develop' into develop
gspetro-NOAA Mar 30, 2022
9d9e079
Merge branch 'ufs-community:develop' into develop
gspetro-NOAA Apr 6, 2022
091574a
create MacOS install/build instructions
gspetro Apr 6, 2022
a5fced9
Merge branch 'develop' of github.com:gspetro-NOAA/ufs-srweather-app i…
gspetro Apr 7, 2022
950a4b7
add MacOS Build/Run instructions
gspetro Apr 7, 2022
713c546
fix MacOS Build/Run details
gspetro Apr 7, 2022
8bc2f6d
add MacOS info directly to Build/Run SRW chapter
gspetro Apr 7, 2022
2d62840
minor details
gspetro Apr 7, 2022
b33d630
minor edits
gspetro Apr 8, 2022
67dbcf9
update Include-HPCInstall with mac installation docs
gspetro Apr 15, 2022
f9adea1
add note re: Terminal.app & bash shell in MacOS section
gspetro Apr 15, 2022
fb014c3
remove MacInstall file-contents added to BuildRunSRW
gspetro Apr 15, 2022
deada14
update hpc-stack submodule to include mac installation info
gspetro Apr 20, 2022
376fe7a
add MacOS config details
gspetro Apr 20, 2022
c8b63dc
add MacOS config & run details
gspetro Apr 21, 2022
1b4135f
minor MacOS note
gspetro Apr 21, 2022
7a61473
mention need to verify software library version #'s
gspetro Apr 27, 2022
535c5c3
Merge branch 'develop' into textonly/mac
gspetro-NOAA May 3, 2022
ccd9e21
update hpc-stack-mod
gspetro May 5, 2022
c742c09
Merge branch 'develop' into textonly/mac
gspetro-NOAA May 5, 2022
8cf44d2
align MacDetails section with PR #238 info
gspetro May 5, 2022
d1267a2
Merge branch 'textonly/mac' of github.com:gspetro-NOAA/ufs-srweather-…
gspetro May 5, 2022
a4af2ed
remove gsed & alter related commands
gspetro May 5, 2022
528fa72
update hpc-stack submodule
gspetro May 6, 2022
25ccf68
typos
gspetro May 6, 2022
744f1bb
switch from env to module load
gspetro-NOAA May 10, 2022
2b48a39
Update BuildRunSRW.rst
gspetro-NOAA May 11, 2022
cb1a6bc
update hpc-stack module docs & MacOS config.sh
gspetro May 13, 2022
b4ce20c
Merge branch 'textonly/mac' of github.com:gspetro-NOAA/ufs-srweather-…
gspetro May 13, 2022
21a9975
update machine file instructions
gspetro May 13, 2022
2c32a67
Merge branch 'ufs-community:develop' into textonly/mac
gspetro-NOAA May 16, 2022
fd58cef
updates to BuildRun chapter
gspetro May 16, 2022
956a7fa
fix typo
gspetro May 24, 2022
3cee7ef
Merge branch 'develop' into textonly/mac
gspetro-NOAA May 24, 2022
233 changes: 216 additions & 17 deletions docs/UsersGuide/source/BuildRunSRW.rst
@@ -72,7 +72,11 @@ The SRW Application source code is publicly available on GitHub. To download the
COMMENT: This will need to be changed to the updated release branch of the SRW repo once it exists.

The cloned repository contains the configuration files and sub-directories shown in
:numref:`Table %s <FilesAndSubDirs>`.
:numref:`Table %s <FilesAndSubDirs>`. The user may set an ``$SRW`` environmental variable to point to the location of the new ``ufs-srweather-app`` repository. For example, if ``ufs-srweather-app`` was cloned into the $HOME directory:

.. code-block:: console

export SRW=$HOME/ufs-srweather-app

.. _FilesAndSubDirs:

@@ -120,15 +124,14 @@ Run the executable that pulls in SRW App components from external repositories:

.. code-block:: console

cd ufs-srweather-app
cd $SRW
./manage_externals/checkout_externals



Build with ``devbuild.sh``
==========================

On Level-1 systems, for which a modulefile is provided under ``modulefiles`` directory, we can build SRW App binaries with:
On Level 1 systems, a modulefile is provided in the ``modulefiles`` directory, which users can load to build the SRW App binaries:

.. code-block:: console

@@ -140,46 +143,114 @@ If compiler auto-detection fails for some reason, specify it using

./devbuild.sh --platform=hera --compiler=intel

If this method doesn't work, we will have to manually setup the environment, and build SRW app binaries with CMake.
If this method doesn't work, users will have to manually set up the environment and build the SRW App binaries with CMake.

..
COMMENT: What would this entail?!

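As a rough sketch of that fallback, the commands below mirror the usual out-of-source CMake pattern. The ``$SRW`` location and the ``-DCMAKE_INSTALL_PREFIX`` flag are assumptions and may differ between releases; the build modulefile described in the next section must be loaded first. The commands are guarded so they are safe to run even on a machine without the repository:

```shell
# Hedged sketch: manual CMake fallback (assumed paths/flags).
SRW="${SRW:-$HOME/ufs-srweather-app}"   # assumption: location of the clone
if command -v cmake >/dev/null && [ -f "${SRW}/CMakeLists.txt" ]; then
  mkdir -p "${SRW}/build" && cd "${SRW}/build"
  cmake .. -DCMAKE_INSTALL_PREFIX=..    # assumed flag: install under the app directory
  make -j4                              # parallel compile
else
  echo "cmake or ${SRW}/CMakeLists.txt not found; load the build environment first"
fi
```

On a properly set-up system this mirrors what ``devbuild.sh`` does internally; check the release notes for the authoritative flag list.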
.. _SetUpBuild:

Set up the Build/Run Environment
================================

We need to setup our environment to run a workflow or to build the SRW app with CMake. Note that ``devbuild.sh`` does not prepare environment for workflow runs so this step is necessary even though binaries are built properly using ``devbuild.sh``.
Before building the SRW App, users must set up their environment to run a workflow or to build the SRW App with CMake. Note that ``devbuild.sh`` does not prepare the environment for workflow runs, so this step is necessary even when the binaries build properly with ``devbuild.sh``.

The build environment must be set up for the user's specific platform. First, verify that ``Lmod`` is the application used for loading modulefiles. This is the case on most systems; however, on some systems, such as Gaea/Odin, the default modulefile loader is from Cray and must be swapped for ``Lmod``. For example, on Gaea, assuming a ``bash`` login shell, run:

.. code-block:: console

source etc/lmod-setup.sh gaea

or if your login shell is ``csh`` or ``tcsh``, source ``etc/lmod-setup.csh`` instead. If you execute the above command on systems that don't need it, it will simply do a ``module purge``. From here on, we can assume, ``Lmod`` is ready to load modulefiles needed by the SRW app.
or if your login shell is ``csh`` or ``tcsh``, source ``etc/lmod-setup.csh`` instead. If you execute the above command on systems that don't need it, it will simply do a ``module purge``. From here on, we can assume that ``Lmod`` is ready to load modulefiles needed by the SRW App.

The modulefiles needed for building and running the SRW App are located in the ``modulefiles`` directory. To load the necessary modulefile for a specific ``<platform>`` using ``<compiler>``, run:

.. code-block:: console

module use <path/to/modulefiles/directory>
module use <path/to/ufs-srweather-app/modulefiles>
module load build_<platform>_<compiler>

where ``<path/to/ufs-srweather-app/modulefiles>`` is the full path to the ``modulefiles`` directory. This will work on Level 1 systems, where a modulefile is available in the ``modulefiles`` directory.

On Level 2-4 systems, users will need to modify certain environment variables, such as the path to NCEP libraries, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build_<platform>_<compiler>`` modulefiles can be copied and used as a template. To check whether Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. On systems without Lmod, users can modify or set the required environment variables with the ``export`` or ``setenv`` commands despending on whether they are using a bash or csh/tcsh shell, respectively:
On Level 2-4 systems, users will need to modify certain environment variables, such as the path to HPC-Stack, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build_<platform>_<compiler>`` modulefiles can be copied and used as a template. To check whether Lmod is installed, run ``echo $LMOD_PKG`` and see whether it outputs a path to the Lmod package. On systems without Lmod, users can modify or set the required environment variables with the ``export`` or ``setenv`` commands, depending on whether they are using a bash or csh/tcsh shell, respectively:

.. code-block::

export <VARIABLE_NAME>=<PATH_TO_MODULE>
setenv <VARIABLE_NAME> <PATH_TO_MODULE>

.. _MacDetails:

Additional Details for Building on MacOS
------------------------------------------

.. note::
Users not building the SRW App to run on MacOS may skip to the :ref:`next section <BuildExecutables>`.

The SRW App can be built on MacOS systems, presuming HPC-Stack has already been successfully installed. The following two options have been tested:

* **Option 1:** MacBookAir 2020, M1 chip (arm64, running natively), 4+4 cores, Big Sur 11.6.4, GNU compiler suite v.11.2.0_3 (gcc, gfortran, g++); no MPI pre-installed

* **Option 2:** MacBook Pro 2015, 2.8 GHz Quad-Core Intel Core i7 (x86_64), Catalina OS X 10.15.7, GNU compiler suite v.11.2.0_3 (gcc, gfortran, g++); no MPI pre-installed

.. note::
Examples in this subsection presume that the user is running Terminal.app with a bash shell environment. If this is not the case, users will need to adjust the commands to fit their command line application and shell environment.

The ``build_macos_gnu`` modulefile initializes the module environment, lists the location of HPC-Stack modules, loads the meta-modules and modules, and sets compilers, additional flags, and environment variables needed for building the SRW App. The modulefile must be modified to include the absolute path to the user's HPC-Stack installation and ``ufs-srweather-app`` directories. In particular, the following section must be modified:

.. code-block:: console

# This path should point to your HPCstack installation directory
setenv HPCstack "/Users/username/hpc-stack/install"
Collaborator:

export HPCstack=/Users/username/hpc-stack/install

Collaborator:

TCL modulefiles use setenv instead of export so I think it is fine as it is.


# This path should point to your SRW Application directory
setenv SRW "/Users/username/ufs-srweather-app"
Collaborator:

export SRW=/Users/username/ufs-srweather-app


Collaborator:

The syntax above and below is for csh. The MacOS instructions assume a bash shell, as stated in line 197 above, and build_macos_gnu needs to use bash syntax to be consistent. The changes for bash shell syntax for lines 204, 207, 214-216, and 219-221 are shown after each corresponding line.

Collaborator:

One of the motivations for using modulefiles is to set the environment irrespective of the login shell used (bash or tcsh).

  module load build_macos_gnu

should work for bash even though it is written in "csh-like" syntax. The only place where login shell dependent code is needed is when loading Lmod itself i.e.

  source etc/lmod-setup.sh macos
             or
  source etc/lmod-setup.csh macos

Collaborator (author):

@natalie-perlin This code is already in the modulefile, which is written in csh-like syntax. In reality, the user is only changing the path. If we change this to bash, we have to redo the entire modulefile, which I don't think is worth it since, as @danielabdi-noaa pointed out, the modulefile should work regardless of the user's shell.

An excerpt of the ``build_macos_gnu`` contents appears below for Option 1. To use Option 2, the user will need to comment out the lines specific to Option 1 and uncomment the lines specific to Option 2 in the ``build_macos_gnu`` modulefile. Additionally, users need to verify that all file paths reflect their system's configuration and that the correct version numbers for software libraries appear in the modulefile.

.. code-block:: console

# Option 1 compiler paths:
setenv CC "/opt/homebrew/bin/gcc"
setenv FC "/opt/homebrew/bin/gfortran"
setenv CXX "/opt/homebrew/bin/g++"
Collaborator:

export CC=/opt/homebrew/bin/gcc
export CXX=/opt/homebrew/bin/g++
export FC=/opt/homebrew/bin/gfortran


# Option 2 compiler paths:
#setenv CC "/usr/local/bin/gcc"
#setenv FC "/usr/local/bin/gfortran"
#setenv CXX "/usr/local/bin/g++"
Collaborator:

#export CC=/usr/local/bin/gcc
#export CXX=/usr/local/bin/g++
#export FC=/usr/local/bin/gfortran


Collaborator:

Also, the following were set in build_macos_gnu after line 222 (again, in bash shell syntax):

export MPI_CC=mpicc
export MPI_CXX=mpicxx
export MPI_FC=mpif90

export CMAKE_C_COMPILER=${MPI_CC}
export CMAKE_CXX_COMPILER=${MPI_CXX}
export CMAKE_Fortran_COMPILER=${MPI_FC}
export CMAKE_Platform=macos.gnu
export CMAKE_Fortran_COMPILER_ID="GNU"
export LDFLAGS="-L${MPI_ROOT}/lib"
export FFLAGS="-DNO_QUAD_PRECISION -fallow-argument-mismatch "

Collaborator (author):

@natalie-perlin All of this is in the modulefile already, so I did not type it out in the instructions because the user should not have to adjust the code in any way. As mentioned above, it doesn't need to get changed to bash syntax because (if I understood @danielabdi-noaa correctly), the modulefile will work regardless of the user's shell.

Then, users must source the Lmod setup file, just as they would on other systems, and load the modulefiles needed for building and running the SRW App:

.. code-block:: console

source etc/lmod-setup.sh
module use <path/to/ufs-srweather-app/modulefiles>
module load build_macos_gnu

.. note::
If you execute ``source etc/lmod-setup.sh`` on systems that don't need it, it will simply do a ``module purge``.

Additionally, for Option 1 systems, set the variable ``ENABLE_QUAD_PRECISION`` to ``OFF`` in line 35 of the ``$SRW/src/ufs-weather-model/FV3/atmos_cubed_sphere/CMakeLists.txt`` file. This change is optional if using Option 2 to build the SRW App. Using a text editor (e.g., vi, vim, emacs):

.. code-block:: console

option(ENABLE_QUAD_PRECISION "Enable compiler definition -DENABLE_QUAD_PRECISION" OFF)

An alternative way to make this change is with ``sed`` (stream editor). From the command line, users can run one of two commands (user's preference):

.. code-block:: console

sed -i -e 's/QUAD_PRECISION\" ON)/QUAD_PRECISION\" OFF)/' CMakeLists.txt
sed -i -e 's/bin\/sh/bin\/bash/g' *sh


.. _BuildExecutables:

Build the Executables
=======================

Create a directory to hold the build's executables:
Create a directory within ``$SRW`` to hold the build's executables:

.. code-block:: console

@@ -528,13 +599,17 @@ To get started, make a copy of ``config.community.sh``. From the ``ufs-srweather

.. code-block:: console

cd regional_workflow/ush
cd $SRW/regional_workflow/ush
cp config.community.sh config.sh

The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization.

Next, edit the new ``config.sh`` file to customize it for your machine. At a minimum, change the ``MACHINE`` and ``ACCOUNT`` variables; then choose a name for the experiment directory by setting ``EXPT_SUBDIR``. If you have pre-staged the initialization data for the experiment, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and set the paths to the data for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``.

.. note::

MacOS users should refer to :numref:`Section %s <MacConfig>` for details on configuring an experiment on MacOS.

Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow <ConfigWorkflow>`, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids <LAMGrids>`.

.. important::
@@ -630,8 +705,7 @@ For WCOSS_CRAY:
.. note::

The values of the configuration variables should be consistent with those in the
``valid_param_vals script``. In addition, various example configuration files can be
found in the ``regional_workflow/tests/baseline_configs`` directory.
``valid_param_vals script``. In addition, various example configuration files can be found in the ``regional_workflow/tests/baseline_configs`` directory.

.. _VXConfig:

Expand Down Expand Up @@ -686,8 +760,8 @@ These tasks are independent, so users may set some values to "TRUE" and others t

.. _SetUpPythonEnv:

Set up the Python and other Environment Parameters
--------------------------------------------------
Set Up the Python and Other Environment Parameters
----------------------------------------------------
The workflow requires Python 3 with the ``PyYAML``, ``Jinja2``, and ``f90nml`` packages available. This Python environment has already been set up on Level 1 platforms, and it can be activated in the following way (from ``ufs-srweather-app/regional_workflow/ush``):

.. code-block:: console
@@ -702,6 +776,126 @@ This command will activate the ``regional_workflow`` conda environment. The user
source ~/.bashrc
conda activate regional_workflow

.. _MacConfig:

Configuring an Experiment on MacOS
------------------------------------------------------------

In principle, the configuration process for MacOS systems is the same as for other systems. However, the details of the configuration process on MacOS require a few extra steps.

.. _MacMorePackages:

Install Additional Packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Check the version of bash, and upgrade it if it is lower than 4. Additionally, install the ``coreutils`` package:

.. code-block:: console

bash --version
brew upgrade bash
brew install coreutils
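
A portable way to perform that version check is to parse the first line of ``bash --version`` (a sketch; the exact version string format is an assumption about GNU bash's output):

```shell
# Extract the major version from `bash --version` and compare against 4.
major="$(bash --version | sed -n '1s/.*version \([0-9][0-9]*\)\..*/\1/p')"
if [ "${major:-0}" -ge 4 ]; then
  echo "bash ${major}.x: OK"
else
  echo "bash ${major:-unknown}: older than 4; run 'brew install bash'"
fi
```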

.. _MacVEnv:

Create a Python Virtual Environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Users must create a Python virtual environment for running the SRW App on MacOS. This involves setting Python 3 as the default, installing the required Python packages, and activating the ``regional_workflow`` environment:

.. code-block:: console

python3 -m pip --version
python3 -m pip install --upgrade pip
python3 -m ensurepip --default-pip
python3 -m venv $HOME/venv/regional_workflow
source $HOME/venv/regional_workflow/bin/activate
python3 -m pip install jinja2
python3 -m pip install pyyaml
python3 -m pip install f90nml
python3 -m pip install ruby    # or install with Homebrew: brew install ruby

The virtual environment can be deactivated by running the ``deactivate`` command. The virtual environment built here will be reactivated in :numref:`Step %s <MacActivateWFenv>` and needs to be used to generate the workflow and run the experiment.
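
To confirm that the environment is usable, a quick hedged check verifies that each required package imports cleanly (note the import names: ``PyYAML`` is imported as ``yaml``):

```shell
# Try importing each required package; report one line per package.
for pkg in jinja2 yaml f90nml; do
  if python3 -c "import ${pkg}" 2>/dev/null; then
    echo "${pkg}: OK"
  else
    echo "${pkg}: missing (install with: python3 -m pip install ${pkg})"
  fi
done
```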

Install Rocoto
^^^^^^^^^^^^^^^^^^

.. note::
Users may `install Rocoto <https://github.com/christopherwharrop/rocoto/blob/develop/INSTALL>`__ if they want to make use of a workflow manager to run their experiments. However, this option has not been tested yet on MacOS and is not supported for this release.


Configure the SRW App
^^^^^^^^^^^^^^^^^^^^^^^^

Users will need to configure their experiment just as they would on any other system. From the ``$SRW/regional_workflow/ush`` directory, users can copy the settings from ``config.community.sh`` into a ``config.sh`` file (see :numref:`Section %s <UserSpecificConfig>` above). In the ``config.sh`` file, users should set ``MACHINE="macos"`` and modify additional variables as needed. For example:

.. code-block:: console

MACHINE="macos"
ACCOUNT="user"
EXPT_SUBDIR="<test_community>"
COMPILER="gnu"
VERBOSE="TRUE"
RUN_ENVIR="community"
PREEXISTING_DIR_METHOD="rename"

PREDEF_GRID_NAME="RRFS_CONUS_25km"
QUILTING="TRUE"

Due to the limited number of processors on MacOS systems (typically 8 CPUs on M1-family chips and 4 CPUs on x86_64 systems), users must adjust the domain decomposition defaults.

For :ref:`Option 1 <MacDetails>`, add the following information to ``config.sh``:

.. code-block:: console

LAYOUT_X="${LAYOUT_X:-3}"
LAYOUT_Y="${LAYOUT_Y:-2}"
WRTCMP_write_groups="1"
WRTCMP_write_tasks_per_group="2"

For :ref:`Option 2 <MacDetails>`, add the following information to ``config.sh``:

.. code-block:: console

LAYOUT_X="${LAYOUT_X:-3}"
LAYOUT_Y="${LAYOUT_Y:-1}"
WRTCMP_write_groups="1"
WRTCMP_write_tasks_per_group="1"

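These values must fit within the available cores. Assuming the usual SRW formula (compute tasks = ``LAYOUT_X`` x ``LAYOUT_Y``, plus the write-component tasks; an assumption worth verifying against your release), Option 1 works out as:

```shell
# Option 1 decomposition: 3x2 compute tasks plus 1 write group of 2 tasks.
LAYOUT_X=3; LAYOUT_Y=2
WRTCMP_write_groups=1; WRTCMP_write_tasks_per_group=2
total=$(( LAYOUT_X * LAYOUT_Y + WRTCMP_write_groups * WRTCMP_write_tasks_per_group ))
echo "MPI tasks required: ${total}"   # prints 8, matching Option 1's 8 cores
```
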
Configure the Machine File
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Configure the machine file based on the number of CPUs in the system (8 or 4). Specify the following variables in ``$SRW/regional_workflow/ush/machine/macos.sh``:

For Option 1 (8 CPUs):

.. code-block:: console

# Architecture information
WORKFLOW_MANAGER="none"
NCORES_PER_NODE=${NCORES_PER_NODE:-8}
SCHED=${SCHED:-"none"}
# Run commands for executables
RUN_CMD_SERIAL="time"
RUN_CMD_UTILS="mpirun -np 4"
RUN_CMD_FCST='mpirun -np ${PE_MEMBER01}'
RUN_CMD_POST="mpirun -np 4"

The same settings can be used for Option 2, except that ``NCORES_PER_NODE=${NCORES_PER_NODE:-8}`` should be set to 4 instead of 8.

.. _MacActivateWFenv:

Activate the Workflow Environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

The ``regional_workflow`` environment can be activated on MacOS as it is for any other system:

.. code-block:: console

cd $SRW/regional_workflow/ush
source ../../env/wflow_macos.env

This should activate the ``regional_workflow`` environment created in :numref:`Step %s <MacVEnv>`. From here, the user may continue to the :ref:`next step <GenerateWorkflow>` and generate the regional workflow.


.. _GenerateWorkflow:

Expand Down Expand Up @@ -891,7 +1085,11 @@ In addition to the baseline tasks described in :numref:`Table %s <WorkflowTasksT
.. _RocotoRun:

Run the Workflow Using Rocoto
=============================
==============================

.. attention::
If users are running the SRW App in a container or on a system that does not have Rocoto installed (e.g., `Level 3 & 4 <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`__ systems, such as MacOS), they should follow the process outlined in :numref:`Section %s <RunUsingStandaloneScripts>` instead of the instructions in this section.

The information in this section assumes that Rocoto is available on the desired platform. (Note that Rocoto cannot be used when running the workflow within a container.) If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts according to the process outlined in :numref:`Section %s <RunUsingStandaloneScripts>`. There are two main ways to run the workflow with Rocoto: (1) with the ``launch_FV3LAM_wflow.sh`` script, and (2) by manually calling the ``rocotorun`` command. Users can also automate the workflow using a crontab.

Optionally, an environment variable can be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is bash, it can be set as follows:
@@ -904,7 +1102,7 @@ If the login shell is csh/tcsh, it can be set using:

.. code-block:: console

setenv EXPTDIR /path-to-experiment/directory
setenv EXPTDIR /<path-to-experiment>/<directory_name>


Launch the Rocoto Workflow Using a Script
@@ -1067,6 +1265,7 @@ After finishing the experiment, open the crontab using ``crontab -e`` and delete

On Orion, *cron* is only available on the orion-login-1 node, so users will need to work on that node when running *cron* jobs on Orion.


The workflow run is complete when all tasks have "SUCCEEDED", and the rocotostat command outputs a table similar to the one :ref:`above <Success>`.

.. _PlotOutput:
1 change: 1 addition & 0 deletions docs/UsersGuide/source/Include-HPCInstall.rst
@@ -1,6 +1,7 @@
.. _InstallHPCstack:

.. include:: ../../../hpc-stack-mod/docs/source/hpc-install.rst
.. include:: ../../../hpc-stack-mod/docs/source/mac-install.rst

.. include:: ../../../hpc-stack-mod/docs/source/hpc-prereqs.rst
.. include:: ../../../hpc-stack-mod/docs/source/hpc-parameters.rst
6 changes: 6 additions & 0 deletions docs/UsersGuide/source/Quickstart.rst
@@ -334,6 +334,12 @@ Check the batch script output file in your experiment directory for a “SUCCESS
| | | | forecast hour) |
+------------+------------------------+----------------+----------------------------+

Users can access log files for specific tasks in the ``$EXPTDIR/log`` directory. To see how the experiment is progressing, users can also check the end of the ``log.launch_FV3LAM_wflow`` file from the command line:

.. code-block:: console

tail -n 40 log.launch_FV3LAM_wflow

.. hint::
If any of the scripts return an error that "Primary job terminated normally, but one process returned a non-zero exit code," there may not be enough resources on one node to run the process. On an HPC system, the user will need to allocate a(nother) compute node. The process for doing so is system-dependent, and users should check the documentation available for their HPC system. Instructions for allocating a compute node on NOAA Cloud systems can be viewed in :numref:`Step %s <WorkOnHPC>` as an example.

4 changes: 2 additions & 2 deletions docs/UsersGuide/source/WE2Etests.rst
@@ -344,7 +344,7 @@ to suit specific testing needs.
.. _ModExistingTest:

Modifying an Existing Test
---------------------
-----------------------------
To modify an existing test, simply edit the configuration file for that test by changing
existing variable values and/or adding new variables to suit the requirements of the
modified test. Such a change may also require modifications to the test description
@@ -372,7 +372,7 @@ above, say ``wflow_features``:
.. _AddNewCategory:

Adding a New WE2E Test Category
-----------------------------
-----------------------------------
To create a new test category called, e.g. ``new_category``:

1) In the directory ``ufs-srweather-app/regional_workflow/tests/WE2E/test_configs``,