
Commit

Update SRW Documentation (#212)
* updated docs

* added git submodule

* fix formatting

* added new submodule commits

* fixed ref links

* finished Intro

* finish Components & Intro edits

* edited Rocoto workflow section of Quickstart

* added minor hpc submodule commits

* Updates to Rocoto Workflow in Quick Start

* add to HPC-stack intro

* submodule updates

* added submodule docs edits

* hpc-stack updates & formatting fixes

* hpc-stack intro edits

* bibtex attempted fix

* add hpc-stack module edits

* update sphinxcontrib version

* add .readthedocs.yaml file

* update .readthedocs.yaml file

* update .readthedocs.yaml file

* update conf.py

* updates .readthedocs.yaml with submodules

* updates .readthedocs.yaml with submodules

* submodule updates

* submodule updates

* minor Intro edits

* minor Intro edits

* minor Intro edits

* submodule updates

* fixed typos in QS

* QS updates

* QS updates

* QS updates

* updates to InputOutput and QS

* fix I/O doc typos

* pull updates to hpc-stack docs

* pull updates to hpc-stack docs

* fix table wrapping

* updates to QS for cloud

* fix QS export statements

* fix QS export statements

* QS edits on bind, config

* add bullet points to notes

* running without rocoto

* add HPC-Stack submodule w/docs

* split QS into container/non-container approaches

* added filepath changes for running in container on Orion, et al.

* edits to overview and container QS

* moved CodeReposAndDirs.rst info to the Introduction & deleted file

* continued edits to SRWAppOverview

* combine overview w/non-container docs

* finish merging non-container guide & SRWOverview, rename/remove files, update FAQ

* minor edits for Intro & QS

* updates to BuildRun doc through 3.8.1

* edits to Build/Run and Components

* remove .gitignore

* fix Ch 3 title, 4 supported platform levels note

* fix typos, add term links

* other minor fixes/suggestions implemented

* updated Intro based on feedback; changed SRW to SRW App throughout

* update comment to Intro citation

* add user-defined vertical levels to future work

* Add instructions for srw_common module load

* fix typo

* update Intro & BuildRunSRW based on Mark's feedback

* minor intro updates

* 1st round of jwolff's edits

* 2nd round of jwolff updates

* update QS intro

* fix minor physics details

* update citation and physics suite name

* add compute node allocation info to QS

* add authoritative hpc-stack docs to Intro

Co-authored-by: gspetro <gillian.s.petro@gmail.com>
gspetro-NOAA and gspetro authored Mar 30, 2022
1 parent 5c17b2c commit 31dab61
Showing 20 changed files with 1,786 additions and 1,535 deletions.
3 changes: 3 additions & 0 deletions .gitmodules
@@ -0,0 +1,3 @@
[submodule "hpc-stack-mod"]
	path = hpc-stack-mod
	url = https://github.com/NOAA-EMC/hpc-stack.git
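
With this entry in place, the HPC-Stack documentation source is pulled in as a git submodule. A minimal sketch of fetching it after cloning, assuming a standard git workflow:

    git submodule update --init --recursive

Read the Docs performs the equivalent step automatically via the ``submodules`` section of the ``.readthedocs.yaml`` file below.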
35 changes: 35 additions & 0 deletions .readthedocs.yaml
@@ -0,0 +1,35 @@
# .readthedocs.yaml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Set the version of Python and other tools you might need
build:
  os: ubuntu-20.04
  tools:
    python: "3.9"
    # You can also specify other tool versions:
    # nodejs: "16"
    # rust: "1.55"
    # golang: "1.17"

# Build documentation in the docs/ directory with Sphinx
sphinx:
  configuration: docs/UsersGuide/source/conf.py

# If using Sphinx, optionally build your docs in additional formats such as PDF
# formats:
#   - pdf

# Optionally declare the Python requirements required to build your docs
python:
  install:
    - requirements: docs/UsersGuide/requirements.txt

submodules:
  include:
    - hpc-stack-mod
  recursive: true

4 changes: 0 additions & 4 deletions docs/UsersGuide/build/.gitignore

This file was deleted.

886 changes: 886 additions & 0 deletions docs/UsersGuide/source/BuildRunSRW.rst

Large diffs are not rendered by default.

261 changes: 0 additions & 261 deletions docs/UsersGuide/source/CodeReposAndDirs.rst

This file was deleted.

80 changes: 80 additions & 0 deletions docs/UsersGuide/source/Components.rst
@@ -0,0 +1,80 @@
.. _Components:

============================
SRW Application Components
============================

The SRW Application assembles a variety of components, including:

* Pre-processor Utilities & Initial Conditions
* UFS Weather Forecast Model
* Unified Post-Processor
* Visualization Examples
* Build System and Workflow

These components are documented within this User's Guide and supported through a `community forum <https://forums.ufscommunity.org/>`_.

.. _Utils:

Pre-processor Utilities and Initial Conditions
==============================================

The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Since the SRW App provides forecasts over a limited area (rather than globally), it is necessary to first generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of halo cells, additional utilities handle topography filtering and shave the number of halo points (based on downstream workflow component requirements). The pre-processing software ``chgres_cube`` is used to convert the raw external model data into initial and lateral boundary condition files in netCDF format. These are needed as input to the FV3-LAM. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User's Guide <https://noaa-emcufs-utils.readthedocs.io/en/latest/>`_.

The SRW Application can be initialized from a range of operational initial condition files. It is possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates.

.. WARNING::
   For GFS data, dates prior to 1 January 2018 may work but are not guaranteed. Public archives of model data can be accessed through the `National Centers for Environmental Information <https://www.ncdc.noaa.gov/data-access/model-data/model-datasets/global-forcast-system-gfs>`_ (NCEI) or through the `NOAA Operational Model Archive and Distribution System <https://nomads.ncep.noaa.gov/>`_ (NOMADS). Raw external model data may be pre-staged on disk by the user.


Forecast Model
==============

The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere
(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2021`. The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` can be found `here <https://ufs-weather-model.readthedocs.io/en/latest/>`__.

Supported model resolutions in this release include 3-, 13-, and 25-km predefined Contiguous U.S. (:term:`CONUS`) domains, each with 64 vertical levels. Preliminary tools for users to define their own domain are also available in the release, with full, formal support of these tools to be provided in future releases. The FV3-LAM uses the Extended Schmidt Gnomonic (ESG) grid, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found `here <https://noaa-emc.github.io/FV3_Dycore_ufs-v2.0.0/html/index.html>`__ and on the `NOAA Geophysical Fluid Dynamics Laboratory website <https://www.gfdl.noaa.gov/fv3/>`_.

Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (:term:`CCPP`), described `here <https://dtcenter.org/community-code/common-community-physics-package-ccpp>`__. Atmospheric physics parameterizations are numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. Four physics suites are supported for the v2.0 release: the FV3_RRFS_v1beta suite, which is being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024; an updated version of the physics suite used in the operational Global Forecast System (GFS) v16; and the FV3_WoFS and FV3_HRRR suites. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation <https://dtcenter.ucar.edu/GMTB/v5.0.0/sci_doc/index.html>`_, and CCPP technical aspects are described in the `CCPP Technical Documentation <https://ccpp-techdoc.readthedocs.io/en/v5.0.0/>`_. The model namelist has many settings beyond the physics options that can be used to optimize various aspects of the model for each of the supported suites.

The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in netCDF format, on a specific projection (e.g., Lambert Conformal) in the horizontal and on model levels in the vertical.

Post-processor
==============

The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the workflow as a way to convert the netCDF output on the native model grid to :term:`GRIB2` format on standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide <https://upp.readthedocs.io/en/latest/>`__.

Output from UPP can be used with visualization, plotting, and verification packages or in
further downstream post-processing (e.g., statistical post-processing techniques).

Visualization Example
=====================
A Python script is provided to create basic visualizations of the model output. The script
is designed to output graphics in PNG format for 14 standard meteorological variables
when using the pre-defined :term:`CONUS` domain. A difference plotting script is also included to visually compare two runs for the same domain and resolution. These scripts are provided only as examples for users familiar with Python. They may be used to perform a visual check to verify that the application is producing reasonable results.

After running ``manage_externals/checkout_externals``, the visualization scripts will be available in the ``ufs-srweather-app/regional_workflow/ush/Python`` directory. Usage information and instructions are described in :numref:`Chapter %s <Graphics>` and are also included at the top of the script.
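
For users who want to confirm that the scripts are in place, a minimal check along these lines should work (paths as described above; the exact listing may vary):

.. code-block:: console

   cd ufs-srweather-app
   ./manage_externals/checkout_externals
   ls regional_workflow/ush/Python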

Build System and Workflow
=========================

The SRW Application has a portable build system and a user-friendly, modular, and
expandable workflow framework.

An umbrella CMake-based build system is used for building the components necessary for running the end-to-end SRW Application, including the UFS Weather Model and the pre- and post-processing software. Additional libraries necessary for the application (e.g., :term:`NCEPLIBS-external` and :term:`NCEPLIBS`) are not included in the SRW Application build system but are available pre-built on pre-configured platforms. On other systems, they can be installed via the HPC-Stack (see :numref:`Chapter %s: Installing the HPC-Stack <InstallHPCStack>`). A small set of system libraries and utilities is assumed to be present on the target computer: the CMake build software; Fortran, C, and C++ compilers; and an :term:`MPI` library.
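
As a rough sketch of the umbrella build (directory names and options here are illustrative; the build chapter of this guide gives the authoritative steps), an out-of-source CMake sequence looks like this:

.. code-block:: console

   # from the top-level ufs-srweather-app directory
   mkdir build && cd build
   cmake .. -DCMAKE_INSTALL_PREFIX=..
   make -j4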

Once built, the provided experiment generator script can be used to create a Rocoto-based
workflow file that will run each task in the system in the proper sequence (see the `Rocoto documentation
<https://github.com/christopherwharrop/rocoto/wiki/Documentation>`_). If Rocoto and/or a batch system is not present on the available platform, the individual components can be run in a stand-alone, command-line fashion with provided run scripts. The generated namelist for the atmospheric model can be modified in order to vary settings such as forecast start and end dates, forecast length, the :term:`CCPP` physics suite, integration time step, history file output frequency, and more. It also allows for configuration of other elements of the workflow; for example, whether to run some or all of the pre-processing, forecast model, and post-processing steps.
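
As an illustrative sketch (script and file names follow ``regional_workflow`` conventions; consult the build/run chapter for the exact procedure), generating and advancing a Rocoto-based experiment looks like this:

.. code-block:: console

   # generate the experiment directory and Rocoto workflow XML
   cd regional_workflow/ush
   ./generate_FV3LAM_wflow.sh

   # from the experiment directory, launch and monitor the workflow
   rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
   rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db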

This SRW Application release has been tested on a variety of platforms widely used by
researchers, such as the NOAA Research and Development High-Performance Computing Systems
(RDHPCS), including Hera, Orion, and Jet; NOAA’s Weather and Climate Operational
Supercomputing System (WCOSS); the National Center for Atmospheric Research (NCAR) Cheyenne
system; the National Severe Storms Laboratory (NSSL) HPC machine, Odin; the National Science Foundation Stampede2 system; and generic Linux and macOS systems using Intel and GNU compilers. Four `levels of support <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`_ have been defined for the SRW Application: pre-configured (Level 1), configurable (Level 2), limited-test (Level 3), and build-only (Level 4) platforms. Each level is further described below.

On pre-configured (Level 1) computational platforms, all the required libraries for building the SRW Application are available in a central place. That means bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) have both been built. The SRW Application is expected to build and run out-of-the-box on these pre-configured platforms.

A few additional computational platforms are considered configurable for the SRW Application release. Configurable platforms (Level 2) are platforms where all of the required libraries for building the SRW Application are expected to install successfully but are not available in a central location. Applications and models are expected to build and run once the required bundled libraries (e.g., NCEPLIBS) and third-party libraries (e.g., NCEPLIBS-external) are built.

Limited-Test (Level 3) and Build-Only (Level 4) computational platforms are those on which the developers have built the code but conducted little or no pre-release testing, respectively. A complete description of the levels of support, along with a list of preconfigured and configurable platforms, can be found on the `SRW Application wiki page <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`_.
11 changes: 6 additions & 5 deletions docs/UsersGuide/source/ConfigNewPlatform.rst
@@ -4,7 +4,7 @@
Configuring a New Platform
==========================

-The UFS SRW Application has been designed to work primarily on a number of Level 1 and 2 support platforms, as specified `here <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`_. However, it is also designed with flexibility in mind, so that any sufficiently up-to-date machine with a UNIX-based operating system should be capable of running the application. A full list of prerequisites for installing the UFS SRW App and running the Graduate Student Test can be found in :numref:`Section %s <SW-OS-Requirements>`.
+The UFS SRW Application has been designed to work primarily on a number of Level 1 and 2 support platforms, as specified `here <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`__. However, it is also designed with flexibility in mind, so that any sufficiently up-to-date machine with a UNIX-based operating system should be capable of running the application. A full list of prerequisites for installing the UFS SRW App and running the Graduate Student Test can be found in :numref:`Section %s <SW-OS-Requirements>`.

The first step to installing on a new machine is to install :term:`NCEPLIBS` (https://github.com/NOAA-EMC/NCEPLIBS), the NCEP libraries package, which is a set of libraries created and maintained by NCEP and EMC that are used in many parts of the UFS. NCEPLIBS comes with a large number of prerequisites (see :numref:`Section %s <SW-OS-Requirements>` for more info), but the only software required before starting the installation process is the following:

@@ -57,7 +57,7 @@ However, it is also possible to install these utilities via Macports (https://ww

Installing NCEPLIBS-external
============================
-In order to facilitate the installation of NCEPLIBS (and therefore, the SRW and other UFS applications) on new platforms, EMC maintains a one-stop package containing most of the prerequisite libraries and software necessary for installing NCEPLIBS. This package is known as NCEPLIBS-external, and is maintained in a git repository at https://github.com/NOAA-EMC/NCEPLIBS-external. Instructions for installing these will depend on your platform, but generally so long as all the above-mentioned prerequisites have been installed you can follow the proceeding instructions verbatim (in bash; a csh-based shell will require different commands). Some examples for installing on specific platforms can be found in the `NCEPLIBS-external/doc directory <https://github.com/NOAA-EMC/NCEPLIBS-external/tree/release/public-v2/doc>`.
+In order to facilitate the installation of NCEPLIBS (and therefore, the SRW App and other UFS applications) on new platforms, EMC maintains a one-stop package containing most of the prerequisite libraries and software necessary for installing NCEPLIBS. This package is known as NCEPLIBS-external and is maintained in a git repository at https://github.com/NOAA-EMC/NCEPLIBS-external. Installation instructions depend on your platform, but in general, as long as all of the above-mentioned prerequisites have been installed, you can follow the instructions below verbatim (in bash; a csh-based shell will require different commands). Some examples for installing on specific platforms can be found in the `NCEPLIBS-external/doc directory <https://github.com/NOAA-EMC/NCEPLIBS-external/tree/release/public-v2/doc>`_.


These instructions will install the NCEPLIBS-external in the current directory tree, so be sure you are in the desired location before starting.
@@ -126,8 +126,8 @@ Further information on including prerequisite libraries, as well as other helpfu

Once the NCEPLIBS package has been successfully installed, you can move on to building the UFS SRW Application.

-Building the UFS Short-Range Weather Application (UFS SRW App)
-==============================================================
+Building the UFS SRW Application
+=======================================
Building the UFS SRW App is similar to building NCEPLIBS, in that the code is stored in a git repository and is built using CMake software. The first step is to retrieve the code from GitHub, using the variables defined earlier:

.. code-block:: console
@@ -212,8 +212,9 @@ Once the data has been staged, setting up your experiment on a platform without
These are the two ``MACHINE`` settings for generic, non-Rocoto-based platforms; you should choose the one most appropriate for your machine. ``MACOS`` has its own setting due to some differences in how command-line utilities function on Darwin-based operating systems.

``LAYOUT_X=2``

``LAYOUT_Y=2``
These are the settings that control the MPI decomposition when running the weather model. There are default values, but for your machine it is recommended that you specify your own layout to achieve the correct number of MPI processes for your application. In total, your machine should be able to handle ``LAYOUT_X×LAYOUT_Y+WRTCMP_write_tasks_per_group`` tasks. ``WRTCMP_write_tasks_per_group`` is the number of MPI tasks that will be set aside for writing model output, and it is a setting dependent on the domain you have selected. You can find and edit the value of this variable in the file ``regional_workflow/ush/set_predef_grid_params.sh``.
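
As a purely illustrative example, a 2×2 compute layout combined with two write tasks requires six MPI processes in total:

.. code-block:: console

   LAYOUT_X="2"
   LAYOUT_Y="2"
   WRTCMP_write_tasks_per_group="2"   # value depends on the chosen domain
   # total MPI tasks required: 2 x 2 + 2 = 6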

``RUN_CMD_UTILS="mpirun -np 4"``
This is the run command for MPI-enabled pre-processing utilities. Depending on your machine and your MPI installation, you may need to use a different command for launching an MPI-enabled executable.
