Update SRW Documentation #212

Merged: 80 commits (Mar 30, 2022)

Commits
370bb6c
updated docs
gspetro Feb 11, 2022
a907487
added git submodule
gspetro Feb 11, 2022
b628dd6
fix formatting
gspetro Feb 11, 2022
467071f
added new submodule commits
gspetro Feb 11, 2022
de00e4c
fixed ref links
gspetro Feb 11, 2022
fb34100
finished Intro
gspetro Feb 11, 2022
701f9e9
finish Components & Intro edits
gspetro Feb 14, 2022
cab1c6f
edited Rocoto workflow section of Quickstart
gspetro Feb 14, 2022
290364e
added minor hpc submodule commits
gspetro Feb 15, 2022
80291a1
Updates to Rocoto Workflow in Quick Start
gspetro Feb 16, 2022
ab97b74
add to HPC-stack intro
gspetro Feb 16, 2022
8056200
submodule updates
gspetro Feb 16, 2022
17504fc
added submodule docs edits
gspetro Feb 17, 2022
357e151
hpc-stack updates & formatting fixes
gspetro Feb 17, 2022
acf555b
hpc-stack intro edits
gspetro Feb 17, 2022
36349a6
bibtex attempted fix
gspetro Feb 18, 2022
838271f
add hpc-stack module edits
gspetro Feb 18, 2022
863b7de
update sphinxcontrib version
gspetro Feb 22, 2022
2b100d9
add .readthedocs.yaml file
gspetro Feb 22, 2022
9e58e67
update .readthedocs.yaml file
gspetro Feb 22, 2022
1830b49
update .readthedocs.yaml file
gspetro Feb 22, 2022
54a647e
update conf.py
gspetro Feb 22, 2022
46d381f
updates .readthedocs.yaml with submodules
gspetro Feb 22, 2022
91af03d
updates .readthedocs.yaml with submodules
gspetro Feb 22, 2022
97616fd
submodule updates
gspetro Feb 22, 2022
21d3e27
submodule updates
gspetro Feb 22, 2022
5af69e5
minor Intro edits
gspetro Feb 23, 2022
ee901e6
minor Intro edits
gspetro Feb 23, 2022
f77cba9
minor Intro edits
gspetro Feb 23, 2022
bc0748c
submodule updates
gspetro Feb 23, 2022
fef6d27
fixed typos in QS
gspetro Feb 23, 2022
0d16101
QS updates
gspetro Feb 24, 2022
418a40b
QS updates
gspetro Feb 24, 2022
77d565d
QS updates
gspetro Feb 25, 2022
2e1a03f
updates to InputOutput and QS
gspetro Feb 25, 2022
80519d4
fix I/O doc typos
gspetro Feb 25, 2022
6f11030
pull updates to hpc-stack docs
gspetro Feb 28, 2022
999a417
pull updates to hpc-stack docs
gspetro Mar 1, 2022
f07fe8a
fix table wrapping
gspetro Mar 1, 2022
14db051
Merge branch 'ufs-community:develop' into develop
gspetro-NOAA Mar 3, 2022
b58d661
updates to QS for cloud
gspetro Mar 3, 2022
301ff5f
Merge branch 'develop' of github.com:gspetro-NOAA/ufs-srweather-app i…
gspetro Mar 3, 2022
0b50e04
fix QS export statements
gspetro Mar 3, 2022
8786b32
fix QS export statements
gspetro Mar 3, 2022
3a442d6
QS edits on bind, config
gspetro Mar 3, 2022
0cae160
add bullet points to notes
gspetro Mar 4, 2022
27247a5
running without rocoto
gspetro Mar 4, 2022
805bb81
add HPC-Stack submodule w/docs
gspetro Mar 4, 2022
29cf292
split QS into container/non-container approaches
gspetro Mar 8, 2022
3e30098
added filepath changes for running in container on Orion, et al.
gspetro Mar 9, 2022
53807fa
edits to overview and container QS
gspetro Mar 10, 2022
93bfe9b
moved CodeReposAndDirs.rst info to the Introduction & deleted file
gspetro Mar 11, 2022
eb00397
continued edits to SRWAppOverview
gspetro Mar 11, 2022
f4d2043
combine overview w/non-container docs
gspetro Mar 15, 2022
d1addf8
finish merging non-container guide & SRWOverview, rename/remove files…
gspetro Mar 16, 2022
fc1a1d4
minor edits for Intro & QS
gspetro Mar 17, 2022
acb77c8
updates to BuildRun doc through 3.8.1
gspetro Mar 17, 2022
70a051b
edits to Build/Run and Components
gspetro Mar 17, 2022
99127e7
remove .gitignore
gspetro Mar 18, 2022
b01268d
fix Ch 3 title, 4 supported platform levels note
gspetro Mar 18, 2022
da35184
fix typos, add term links
gspetro Mar 18, 2022
1302868
other minor fixes/suggestions implemented
gspetro Mar 18, 2022
a704a2f
updated Intro based on feedback; changed SRW to SRW App throughout
gspetro Mar 21, 2022
7fc263d
update comment to Intro citation
gspetro Mar 21, 2022
496fcb3
Merge branch 'develop' into develop
gspetro-NOAA Mar 22, 2022
10de71f
add user-defined vertical levels to future work
gspetro Mar 22, 2022
16b0c1a
Merge branch 'develop' of github.com:gspetro-NOAA/ufs-srweather-app i…
gspetro Mar 22, 2022
e294020
Merge branch 'ufs-community:develop' into develop
gspetro-NOAA Mar 23, 2022
92bddca
Add instructions for srw_common module load
gspetro Mar 23, 2022
698613b
Pull changes from upstream
gspetro Mar 23, 2022
6fa5074
fix typo
gspetro Mar 23, 2022
a5ae76e
update Intro & BuildRunSRW based on Mark's feedback
gspetro Mar 23, 2022
ea17b19
minor intro updates
gspetro Mar 23, 2022
1aa9322
1st round of jwolff's edits
gspetro Mar 25, 2022
3d1cddb
2nd round of jwolff updates
gspetro Mar 28, 2022
173b838
update QS intro
gspetro Mar 29, 2022
09581c8
fix minor physics details
gspetro Mar 29, 2022
a714d43
update citation and physics suite name
gspetro Mar 29, 2022
4757b40
add compute node allocation info to QS
gspetro Mar 29, 2022
2c68823
add authoritative hpc-stack docs to Intro
gspetro Mar 29, 2022
3 changes: 3 additions & 0 deletions .gitmodules
@@ -0,0 +1,3 @@
[submodule "hpc-stack-mod"]
    path = hpc-stack-mod
    url = https://github.com/NOAA-EMC/hpc-stack.git
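Because this PR adds hpc-stack as a git submodule, a fresh clone of the branch needs the submodule initialized before the docs build can find it. A generic sketch (the clone URL is illustrative):

.. code-block:: console

   git clone https://github.com/ufs-community/ufs-srweather-app.git
   cd ufs-srweather-app
   git submodule update --init --recursive   # fetches hpc-stack-mod per .gitmodules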
35 changes: 35 additions & 0 deletions .readthedocs.yaml
@@ -0,0 +1,35 @@
# .readthedocs.yaml
# Read the Docs configuration file
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Set the version of Python and other tools you might need
build:
  os: ubuntu-20.04
  tools:
    python: "3.9"
    # You can also specify other tool versions:
    # nodejs: "16"
    # rust: "1.55"
    # golang: "1.17"

# Build documentation in the docs/ directory with Sphinx
sphinx:
  configuration: docs/UsersGuide/source/conf.py

# If using Sphinx, optionally build your docs in additional formats such as PDF
# formats:
#   - pdf

# Optionally declare the Python requirements required to build your docs
python:
  install:
    - requirements: docs/UsersGuide/requirements.txt

submodules:
  include:
    - hpc-stack-mod
  recursive: true

4 changes: 0 additions & 4 deletions docs/UsersGuide/build/.gitignore

This file was deleted.

886 changes: 886 additions & 0 deletions docs/UsersGuide/source/BuildRunSRW.rst

Large diffs are not rendered by default.

261 changes: 0 additions & 261 deletions docs/UsersGuide/source/CodeReposAndDirs.rst

This file was deleted.

80 changes: 80 additions & 0 deletions docs/UsersGuide/source/Components.rst
@@ -0,0 +1,80 @@
.. _Components:

============================
SRW Application Components
============================

The SRW Application assembles a variety of components, including:

* Pre-processor Utilities & Initial Conditions
* UFS Weather Forecast Model
* Unified Post-Processor
* Visualization Examples
* Build System and Workflow

These components are documented within this User's Guide and supported through a `community forum <https://forums.ufscommunity.org/>`_.

.. _Utils:

Pre-processor Utilities and Initial Conditions
==============================================

The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Since the SRW App provides forecast predictions over a limited area (rather than globally), it is necessary to first generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of halo cells, additional utilities handle topography filtering and shave the number of halo points (based on downstream workflow component requirements). The pre-processing software ``chgres_cube`` is used to convert the raw external model data into initial and lateral boundary condition files in netCDF format. These are needed as input to the FV3-LAM. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User's Guide <https://noaa-emcufs-utils.readthedocs.io/en/latest/>`_.

The SRW Application can be initialized from a range of operational initial condition files. It is possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates.

.. WARNING::
   For GFS data, dates prior to 1 January 2018 may work but are not guaranteed. Public archives of model data can be accessed through the `National Centers for Environmental Information <https://www.ncdc.noaa.gov/data-access/model-data/model-datasets/global-forcast-system-gfs>`_ (NCEI) or through the `NOAA Operational Model Archive and Distribution System <https://nomads.ncep.noaa.gov/>`_ (NOMADS). Raw external model data may be pre-staged on disk by the user.


Forecast Model
==============

The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere
(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2021`. The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here <https://ufs-weather-model.readthedocs.io/en/latest/>`__.

Supported model resolutions in this release include 3-, 13-, and 25-km predefined Contiguous U.S. (:term:`CONUS`) domains, each with 64 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found `here <https://noaa-emc.github.io/FV3_Dycore_ufs-v2.0.0/html/index.html>`__ and on the `NOAA Geophysical Fluid Dynamics Laboratory website <https://www.gfdl.noaa.gov/fv3/>`_.
Collaborator comment: Is the intention to provide full support for the user-defined domains now? If so, this text should be modified. It was preliminary for the v1.0 release, but I think it is probably able to be fully supported now (but I leave that up to EPIC to decide for sure).

Interoperable atmospheric physics, along with various land surface model options, are supported through the Common Community Physics Package (:term:`CCPP`), described `here <https://dtcenter.org/community-code/common-community-physics-package-ccpp>`__. Atmospheric physics comprises a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. Four physics suites will be supported for the v2.0 release. The first is the FV3_RRFS_v1beta physics suite, which is being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024. The second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v16. The FV3_WoFS and FV3_HRRR suites will also be supported. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation <https://dtcenter.ucar.edu/GMTB/v5.0.0/sci_doc/index.html>`_, and CCPP technical aspects are described in the `CCPP Technical Documentation <https://ccpp-techdoc.readthedocs.io/en/v5.0.0/>`_. The model namelist has many settings beyond the physics options that can be used to optimize various aspects of the model for each of the supported suites.
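In the SRW App workflow, the physics suite is selected at experiment generation time through the ``CCPP_PHYS_SUITE`` setting in the experiment configuration file. The fragment below is an illustrative sketch only; the exact suite identifier strings accepted by the workflow are defined in the ``regional_workflow`` repository:

.. code-block:: console

   # In config.sh: choose one of the supported CCPP suites
   CCPP_PHYS_SUITE="FV3_RRFS_v1beta"
   # CCPP_PHYS_SUITE="FV3_GFS_v16"
   # CCPP_PHYS_SUITE="FV3_WoFS"
   # CCPP_PHYS_SUITE="FV3_HRRR"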

The SRW App supports the use of both :term:`GRIB2` and :term:`NEMSIO` input data. The UFS Weather Model ingests initial and lateral boundary condition files produced by :term:`chgres_cube` and outputs files in netCDF format on a specific projection (e.g., Lambert Conformal) in the horizontal direction and model levels in the vertical direction.

Post-processor
==============

The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the workflow as a way to convert the netCDF output on the native model grid to :term:`GRIB2` format on standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide <https://upp.readthedocs.io/en/latest/>`__.

Output from UPP can be used with visualization, plotting, and verification packages or in
further downstream post-processing (e.g., statistical post-processing techniques).

Visualization Example
=====================
A Python script is provided to create basic visualization of the model output. The script
is designed to output graphics in PNG format for 14 standard meteorological variables
when using the pre-defined :term:`CONUS` domain. A difference plotting script is also included to visually compare two runs for the same domain and resolution. These scripts are provided only as an example for users familiar with Python. They may be used to perform a visual check to verify that the application is producing reasonable results.

After running ``manage_externals/checkout_externals``, the visualization scripts will be available in the ``ufs-srweather-app/regional_workflow/ush/Python`` directory. Usage information and instructions are described in :numref:`Chapter %s <Graphics>` and are also included at the top of the script.
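A typical invocation has the following general form; the argument list shown here is illustrative only, and the authoritative usage is documented at the top of the script and in :numref:`Chapter %s <Graphics>`:

.. code-block:: console

   cd ufs-srweather-app/regional_workflow/ush/Python
   # Arguments (illustrative): cycle date/hour, first and last forecast hour,
   # hour increment, experiment directory, and map-data directory
   python plot_allvars.py 2019061518 0 12 6 /path/to/expt_dir /path/to/NaturalEarth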

Build System and Workflow
=========================

The SRW Application has a portable build system and a user-friendly, modular, and
expandable workflow framework.

An umbrella CMake-based build system is used for building the components necessary for running the end-to-end SRW Application, including the UFS Weather Model and the pre- and post-processing software. Additional libraries necessary for the application (e.g., :term:`NCEPLIBS-external` and :term:`NCEPLIBS`) are not included in the SRW Application build system but are available pre-built on pre-configured platforms. On other systems, they can be installed via the HPC-Stack (see :numref:`Chapter %s: Installing the HPC-Stack <InstallHPCStack>`). There is a small set of system libraries and utilities that are assumed to be present on the target computer: the CMake build software, a Fortran, C, and C++ compiler, and an :term:`MPI` library.
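On a pre-configured (Level 1) platform, the build typically follows the standard out-of-source CMake pattern sketched below; exact module loads and CMake options vary by platform and are covered elsewhere in this guide:

.. code-block:: console

   cd ufs-srweather-app
   mkdir build && cd build
   cmake .. -DCMAKE_INSTALL_PREFIX=..
   make -j 4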

Once built, the provided experiment generator script can be used to create a Rocoto-based
workflow file that will run each task in the system in the proper sequence (see `Rocoto documentation
<https://github.com/christopherwharrop/rocoto/wiki/Documentation>`_). If Rocoto and/or a batch system is not present on the available platform, the individual components can be run in a stand-alone, command line fashion with provided run scripts. The generated namelist for the atmospheric model can be modified in order to vary settings such as forecast starting and ending dates, forecast length hours, the :term:`CCPP` physics suite, integration time step, history file output frequency, and more. It also allows for configuration of other elements of the workflow; for example, whether to run some or all of the pre-processing, forecast model, and post-processing steps.
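Once the experiment generator has produced a workflow, it is typically advanced and monitored with the standard Rocoto commands; paths and file names below are illustrative:

.. code-block:: console

   cd /path/to/expt_dirs/my_experiment
   rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10   # submit tasks that are ready
   rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db        # check task status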

Collaborator comment: Oh, this also just occurred to me - we will need to mention the ensemble capability in the UG as well. I think @JeffBeck-NOAA was going to work on that at some point! (Again, like the vx, that can be added later in a separate PR.)

Collaborator comment: @jwolff-ncar, yep, we'll need ensemble and stochastic physics included in the documentation. I added some fairly detailed in-line comments within the original PRs, and those will be a good place to start.

Author comment: @JeffBeck-NOAA Can you link to the PRs and/or list the numbers I should look at?

Collaborator comment (JeffBeck-NOAA, Mar 28, 2022): Yes! PR #237 and PR #685 for the tendency-based and SPP stochastic physics, respectively. More SPP information is available in the ufs-weather-model-related issue here. Ensemble mode was implemented in PR #245, with a bug fix here and here.

This SRW Application release has been tested on a variety of platforms widely used by
researchers, such as the NOAA Research and Development High-Performance Computing Systems
(RDHPCS), including Hera, Orion, and Jet; NOAA’s Weather and Climate Operational
Supercomputing System (WCOSS); the National Center for Atmospheric Research (NCAR) Cheyenne
system; the National Severe Storms Laboratory (NSSL) HPC machine, Odin; the National Science Foundation Stampede2 system; and generic Linux and macOS systems using Intel and GNU compilers. Four `levels of support <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`_ have been defined for the SRW Application: pre-configured (Level 1), configurable (Level 2), limited-test (Level 3), and build-only (Level 4) platforms. Each level is further described below.

On pre-configured (Level 1) computational platforms, all the required libraries for building the SRW Application are available in a central place. That means bundled libraries (NCEPLIBS) and third-party libraries (NCEPLIBS-external) have both been built. The SRW Application is expected to build and run out-of-the-box on these pre-configured platforms.

A few additional computational platforms are considered configurable for the SRW Application release. Configurable platforms (Level 2) are platforms where all of the required libraries for building the SRW Application are expected to install successfully but are not available in a central location. Applications and models are expected to build and run once the required bundled libraries (e.g., NCEPLIBS) and third-party libraries (e.g., NCEPLIBS-external) are built.

Limited-test (Level 3) and build-only (Level 4) computational platforms are those on which the developers have built the code but where little or no pre-release testing has been conducted. A complete description of the levels of support, along with a list of preconfigured and configurable platforms, can be found on the `SRW Application wiki page <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`_.
11 changes: 6 additions & 5 deletions docs/UsersGuide/source/ConfigNewPlatform.rst
Expand Up @@ -4,7 +4,7 @@
Configuring a New Platform
==========================

The UFS SRW Application has been designed to work primarily on a number of Level 1 and 2 support platforms, as specified `here <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`_. However, it is also designed with flexibility in mind, so that any sufficiently up-to-date machine with a UNIX-based operating system should be capable of running the application. A full list of prerequisites for installing the UFS SRW App and running the Graduate Student Test can be found in :numref:`Section %s <SW-OS-Requirements>`.
The UFS SRW Application has been designed to work primarily on a number of Level 1 and 2 support platforms, as specified `here <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`__. However, it is also designed with flexibility in mind, so that any sufficiently up-to-date machine with a UNIX-based operating system should be capable of running the application. A full list of prerequisites for installing the UFS SRW App and running the Graduate Student Test can be found in :numref:`Section %s <SW-OS-Requirements>`.

The first step to installing on a new machine is to install :term:`NCEPLIBS` (https://github.com/NOAA-EMC/NCEPLIBS), the NCEP libraries package, which is a set of libraries created and maintained by NCEP and EMC that are used in many parts of the UFS. NCEPLIBS comes with a large number of prerequisites (see :numref:`Section %s <SW-OS-Requirements>` for more info), but the only required software prior to starting the installation process are as follows:

Expand Down Expand Up @@ -57,7 +57,7 @@ However, it is also possible to install these utilities via Macports (https://ww

Installing NCEPLIBS-external
============================
In order to facilitate the installation of NCEPLIBS (and therefore, the SRW and other UFS applications) on new platforms, EMC maintains a one-stop package containing most of the prerequisite libraries and software necessary for installing NCEPLIBS. This package is known as NCEPLIBS-external, and is maintained in a git repository at https://github.com/NOAA-EMC/NCEPLIBS-external. Instructions for installing these will depend on your platform, but generally so long as all the above-mentioned prerequisites have been installed you can follow the proceeding instructions verbatim (in bash; a csh-based shell will require different commands). Some examples for installing on specific platforms can be found in the `NCEPLIBS-external/doc directory <https://github.com/NOAA-EMC/NCEPLIBS-external/tree/release/public-v2/doc>`_.
In order to facilitate the installation of NCEPLIBS (and therefore, the SRW App and other UFS applications) on new platforms, EMC maintains a one-stop package containing most of the prerequisite libraries and software necessary for installing NCEPLIBS. This package is known as NCEPLIBS-external, and is maintained in a git repository at https://github.com/NOAA-EMC/NCEPLIBS-external. Instructions for installing these will depend on your platform, but generally so long as all the above-mentioned prerequisites have been installed you can follow the proceeding instructions verbatim (in bash; a csh-based shell will require different commands). Some examples for installing on specific platforms can be found in the `NCEPLIBS-external/doc directory <https://github.com/NOAA-EMC/NCEPLIBS-external/tree/release/public-v2/doc>`_.
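The overall shape of the NCEPLIBS-external build is sketched below; the branch name and options are illustrative, and the NCEPLIBS-external/doc examples should be consulted for platform-specific instructions:

.. code-block:: console

   git clone -b release/public-v2 --recursive https://github.com/NOAA-EMC/NCEPLIBS-external
   cd NCEPLIBS-external
   mkdir build && cd build
   cmake .. 2>&1 | tee log.cmake
   make -j4 2>&1 | tee log.make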


These instructions will install the NCEPLIBS-external in the current directory tree, so be sure you are in the desired location before starting.
Expand Down Expand Up @@ -126,8 +126,8 @@ Further information on including prerequisite libraries, as well as other helpfu

Once the NCEPLIBS package has been successfully installed, you can move on to building the UFS SRW Application.

Building the UFS Short-Range Weather Application (UFS SRW App)
==============================================================
Building the UFS SRW Application
=======================================
Building the UFS SRW App is similar to building NCEPLIBS, in that the code is stored in a git repository and is built using CMake software. The first step is to retrieve the code from GitHub, using the variables defined earlier:

.. code-block:: console
Expand Down Expand Up @@ -212,8 +212,9 @@ Once the data has been staged, setting up your experiment on a platform without
These are the two ``MACHINE`` settings for generic, non-Rocoto-based platforms; you should choose the one most appropriate for your machine. ``MACOS`` has its own setting due to some differences in how command-line utilities function on Darwin-based operating systems.

``LAYOUT_X=2``

``LAYOUT_Y=2``
These are the settings that control the MPI decomposition when running the weather model. There are default values, but for your machine it is recommended that you specify your own layout to achieve the correct number of MPI processes for your application. In total, your machine should be able to handle ``LAYOUT_X×LAYOUT_Y+WRTCMP_write_tasks_per_group`` tasks. ``WRTCMP_write_tasks_per_group`` is the number of MPI tasks that will be set aside for writing model output, and it is a setting dependent on the domain you have selected. You can find and edit the value of this variable in the file ``regional_workflow/ush/set_predef_grid_params.sh``.

``RUN_CMD_UTILS="mpirun -np 4"``
This is the run command for MPI-enabled pre-processing utilities. Depending on your machine and your MPI installation, you may need to use a different command for launching an MPI-enabled executable.
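Taken together, a minimal ``config.sh`` fragment for a generic Linux machine might look like the following sketch (values are illustrative; adjust to your core counts and MPI launcher):

.. code-block:: console

   MACHINE="LINUX"
   LAYOUT_X=2
   LAYOUT_Y=2
   RUN_CMD_UTILS="mpirun -np 4"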
Expand Down