diff --git a/README.md b/README.md index 1ec8497218..1e5abbdf13 100644 --- a/README.md +++ b/README.md @@ -11,4 +11,4 @@ The UFS SRW App User's Guide associated with the development branch is at: https For instructions on how to clone the repository, build the code, and run the workflow, see: https://github.com/ufs-community/ufs-srweather-app/wiki/Getting-Started -UFS Development Team. (2022, June 17). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v2.0.0). Zenodo. https://doi.org/10.5281/zenodo.6505854 +UFS Development Team. (2022, June 22). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v2.0.0). Zenodo. https://doi.org/10.5281/zenodo.6505854 diff --git a/docs/UsersGuide/source/BuildRunSRW.rst b/docs/UsersGuide/source/BuildRunSRW.rst index 7f04b51bfe..995b3aa4cb 100644 --- a/docs/UsersGuide/source/BuildRunSRW.rst +++ b/docs/UsersGuide/source/BuildRunSRW.rst @@ -159,6 +159,9 @@ On Level 1 systems for which a modulefile is provided under the ``modulefiles`` where ```` is replaced with the name of the platform the user is working on. Valid values are: ``cheyenne`` | ``gaea`` | ``hera`` | ``jet`` | ``linux`` | ``macos`` | ``noaacloud`` | ``odin`` | ``orion`` | ``singularity`` | ``wcoss_dell_p3`` +.. note:: + Although build modulefiles exist for generic Linux and MacOS machines, users will need to alter these according to the instructions in Sections :numref:`%s ` & :numref:`%s `. It is recommended that users on these systems build the SRW App with the :ref:`CMake Approach ` instead. + If compiler auto-detection fails for some reason, specify it using the ``--compiler`` argument. For example: .. code-block:: console @@ -239,7 +242,7 @@ Set Up the Build Environment .. attention:: * If users successfully built the executables in :numref:`Table %s `, they should skip to step :numref:`Step %s `. - * Users who want to build the SRW App on a generic MacOS should skip to :numref:`Step %s ` and follow the approach there. + * Users who want to build the SRW App on a generic MacOS should skip to :numref:`Section %s ` and follow the approach there. If the ``devbuild.sh`` approach failed, users need to set up their environment to run a workflow on their specific platform. First, users should make sure ``Lmod`` is the app used for loading modulefiles. This is the case on most Level 1 systems; however, on systems such as Gaea/Odin, the default modulefile loader is from Cray and must be switched to Lmod. For example, on Gaea, users can run one of the following two commands depending on whether they have a bash or csh shell, respectively: @@ -259,7 +262,7 @@ From here on, ``Lmod`` is ready to load the modulefiles needed by the SRW App. T where ```` is the full path to the ``modulefiles`` directory. -This will work on Level 1 systems, where a modulefile is available in the ``modulefiles`` directory. On Level 2-4 systems, users will need to modify certain environment variables, such as the path to HPC-Stack, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__`` modulefiles can be copied and used as a template. To check whether Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. 
On systems without Lmod, users can modify or set the required environment variables with the ``export`` or ``setenv`` commands, depending on whether they are using a bash or csh/tcsh shell, respectively: +This will work on Level 1 systems, where a modulefile is available in the ``modulefiles`` directory. On Level 2-4 systems (including generic Linux/MacOS systems), users will need to modify certain environment variables, such as the path to HPC-Stack, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build__`` modulefiles can be copied and used as a template. To check whether Lmod is installed, run ``echo $LMOD_PKG``, and see if it outputs a path to the Lmod package. On systems without Lmod, users can modify or set the required environment variables with the ``export`` or ``setenv`` commands, depending on whether they are using a bash or csh/tcsh shell, respectively: .. code-block:: @@ -273,7 +276,7 @@ Note that building the SRW App without Lmod is not supported for this release. I Build the Executables Using CMake ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -After setting up the build environment in the preceding section, users need to build the executables required to run the SRW App. In the ``ufs-srweather-app`` directory, create a subdirectory to hold the build's executables: +After setting up the build environment in the preceding section (by loading the ``build__`` modulefile), users need to build the executables required to run the SRW App. In the ``ufs-srweather-app`` directory, create a subdirectory to hold the build's executables: .. code-block:: console @@ -390,11 +393,11 @@ Generate the Forecast Experiment ================================= Generating the forecast experiment requires three steps: -* :ref:`Set experiment parameters ` -* :ref:`Set Python and other environment parameters ` -* :ref:`Run a script to generate the experiment workflow ` +#. :ref:`Set experiment parameters ` +#. :ref:`Set Python and other environment parameters ` +#. :ref:`Run a script to generate the experiment workflow ` -The first two steps depend on the platform being used and are described here for each Level 1 platform. Users will need to adjust the instructions to their machine if they are working on a Level 2-4 platform. Information in :numref:`Chapter %s: Configuring the Workflow ` can help with this. +The first two steps depend on the platform being used and are described here for each Level 1 platform. Users will need to adjust the instructions to reflect their machine configuration if they are working on a Level 2-4 platform. Information in :numref:`Chapter %s: Configuring the Workflow ` can help with this. .. _ExptConfig: @@ -403,23 +406,15 @@ Set Experiment Parameters Each experiment requires certain basic information to run (e.g., date, grid, physics suite). This information is specified in ``config_defaults.sh`` and in the user-specified ``config.sh`` file. When generating a new experiment, the SRW App first reads and assigns default values from the ``config_defaults.sh`` file. Then, it reads and (re)assigns variables from the user's custom ``config.sh`` file. -Section Overview: - - * For background information on ``config_defaults.sh``, read :numref:`Section %s `. - * Jump to :numref:`Section %s ` to continue configuring the experiment. - * Go to :numref:`Section %s ` for additional details on configuring an experiment on a generic Linux or MacOS system. - - .. 
_DefaultConfigSection: Default configuration: ``config_defaults.sh`` ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ .. note:: - This section provides background information on how the SRW App uses the ``config_defaults.sh`` file. This information is informative, but users do not need to modify ``config_defaults.sh`` to run the out-of-the-box case for the SRW App. Users may skip to :numref:`Step %s ` to continue configuring their experiment. + This section provides background information on how the SRW App uses the ``config_defaults.sh`` file. It is informative, but users do not need to modify ``config_defaults.sh`` to run the out-of-the-box case for the SRW App. Users may skip to :numref:`Step %s ` to continue configuring their experiment. -Important configuration variables in the ``config_defaults.sh`` file appear in -:numref:`Table %s `. Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in the user-specified ``config.sh`` file. Any settings provided in ``config.sh`` will override the ``config_defaults.sh`` +Configuration variables in the ``config_defaults.sh`` file appear in :numref:`Table %s `. Some of these default values are intentionally invalid in order to ensure that the user assigns valid values in the user-specified ``config.sh`` file. Any settings provided in ``config.sh`` will override the ``config_defaults.sh`` settings. There is usually no need for a user to modify the default configuration file. Additional information on the default settings can be found in the file itself and in :numref:`Chapter %s `. .. _ConfigVarsDefault: @@ -431,9 +426,9 @@ settings. There is usually no need for a user to modify the default configuratio +======================+==============================================================+ | Experiment mode | RUN_ENVIR | +----------------------+--------------------------------------------------------------+ - | Machine and queue | MACHINE, MACHINE_FILE, ACCOUNT, COMPILER | - | | NCORES_PER_NODE, LMOD_PATH, BUILD_MOD_FN, WFLOW_MOD_FN, | - | | SCHED, PARTITION_DEFAULT, CLUSTERS_DEFAULT, QUEUE_DEFAULT, | + | Machine and queue | MACHINE, MACHINE_FILE, ACCOUNT, COMPILER, SCHED, | + | | LMOD_PATH, NCORES_PER_NODE, BUILD_MOD_FN, WFLOW_MOD_FN, | + | | PARTITION_DEFAULT, CLUSTERS_DEFAULT, QUEUE_DEFAULT, | | | PARTITION_HPSS, CLUSTERS_HPSS, QUEUE_HPSS, PARTITION_FCST, | | | CLUSTERS_FCST, QUEUE_FCST | +----------------------+--------------------------------------------------------------+ @@ -512,7 +507,7 @@ settings. There is usually no need for a user to modify the default configuratio | | OROG_DIR, RUN_TASK_MAKE_SFC_CLIMO, SFC_CLIMO_DIR | +----------------------+--------------------------------------------------------------+ | Cycle dependent | RUN_TASK_GET_EXTRN_ICS, RUN_TASK_GET_EXTRN_LBCS, | - | | RUN_TASK_MAKE_ICS, RUN_TASK_MAKE_LBCS, RUN_TASK_RUN_FCST | + | | RUN_TASK_MAKE_ICS, RUN_TASK_MAKE_LBCS, RUN_TASK_RUN_FCST, | | | RUN_TASK_RUN_POST | +----------------------+--------------------------------------------------------------+ | VX run tasks | RUN_TASK_GET_OBS_CCPA, RUN_TASK_GET_OBS_MRMS, | @@ -520,9 +515,9 @@ settings. 
There is usually no need for a user to modify the default configuratio | | RUN_TASK_VX_POINTSTAT, RUN_TASK_VX_ENSGRID, | | | RUN_TASK_VX_ENSPOINT | +----------------------+--------------------------------------------------------------+ - | Surface climatology | SFC_CLIMO_FIELDS, FIXgsm, TOPO_DIR, SFC_CLIMO_INPUT_DIR, | - | | FNGLAC, FNMXIC, FNTSFC, FNSNOC, FNZORC, FNAISC, FNSMCC, | - | | FNMSKH, FIXgsm_FILES_TO_COPY_TO_FIXam, | + | Fixed File Parameters| FIXgsm, FIXaer, FIXlut, TOPO_DIR, SFC_CLIMO_INPUT_DIR, | + | | FNGLAC, FNMXIC, FNTSFC, FNSNOC, FNZORC, | + | | FNAISC, FNSMCC, FNMSKH, FIXgsm_FILES_TO_COPY_TO_FIXam, | | | FV3_NML_VARNAME_TO_FIXam_FILES_MAPPING, | | | FV3_NML_VARNAME_TO_SFC_CLIMO_FIELD_MAPPING, | | | CYCLEDIR_LINKS_TO_FIXam_FILES_MAPPING | @@ -570,7 +565,7 @@ settings. There is usually no need for a user to modify the default configuratio | Maximum attempt | MAXTRIES_MAKE_GRID, MAXTRIES_MAKE_OROG, | | | MAXTRIES_MAKE_SFC_CLIMO, MAXTRIES_GET_EXTRN_ICS, | | | MAXTRIES_GET_EXTRN_LBCS, MAXTRIES_MAKE_ICS, | - | | MAXTRIES_MAKE_LBCS, MAXTRIES_RUN_FCST, MAXTRIES_RUN_POST | + | | MAXTRIES_MAKE_LBCS, MAXTRIES_RUN_FCST, MAXTRIES_RUN_POST, | | | MAXTRIES_GET_OBS_CCPA, MAXTRIES_GET_OBS_MRMS, | | | MAXTRIES_GET_OBS_NDAS, MAXTRIES_VX_GRIDSTAT, | | | MAXTRIES_VX_GRIDSTAT_REFC, MAXTRIES_VX_GRIDSTAT_RETOP, | @@ -583,9 +578,7 @@ settings. There is usually no need for a user to modify the default configuratio | | MAXTRIES_VX_ENSGRID_PROB_RETOP, MAXTRIES_VX_ENSPOINT, | | | MAXTRIES_VX_ENSPOINT_MEAN, MAXTRIES_VX_ENSPOINT_PROB | +----------------------+--------------------------------------------------------------+ - | Aerosol climatology | USE_MERRA_CLIMO, FIXaer | - +----------------------+--------------------------------------------------------------+ - | Fixed file params | FIXlut | + | Climatology | SFC_CLIMO_FIELDS, USE_MERRA_CLIMO | +----------------------+--------------------------------------------------------------+ | CRTM | USE_CRTM, CRTM_DIR | +----------------------+--------------------------------------------------------------+ @@ -716,17 +709,13 @@ To get started, make a copy of ``config.community.sh``. From the ``ufs-srweather The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization. -Next, edit the new ``config.sh`` file to customize it for your machine. At a minimum, change the ``MACHINE`` and ``ACCOUNT`` variables; then choose a name for the experiment directory by setting ``EXPT_SUBDIR``. If you have pre-staged the initialization data for the experiment, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and set the paths to the data for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. +Next, edit the new ``config.sh`` file to customize it for your machine. At a minimum, change the ``MACHINE`` and ``ACCOUNT`` variables; then choose a name for the experiment directory by setting ``EXPT_SUBDIR``. If you have pre-staged initialization data for the experiment, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and set the paths to the data for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``. If the modulefile used to set up the build environment in :numref:`Section %s ` uses a GNU compiler, check that the line ``COMPILER="gnu"`` appears in the ``config.sh`` file. .. 
note:: - Generic Linux and MacOS users should refer to :numref:`Section %s ` for details on configuring an experiment and python environment. - -Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `. - -.. important:: + Generic Linux and MacOS users should refer to :numref:`Section %s ` for additional details on configuring an experiment and python environment. - If your modulefile uses a GNU compiler to set up the build environment in :numref:`Section %s `, you will have to check that the line ``COMPILER="gnu"`` appears in the ``config.sh`` file. +Sample ``config.sh`` settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow `, which discusses each variable and the options available. Additionally, information about the four predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids `. .. hint:: @@ -748,9 +737,11 @@ Minimum parameter settings for running the out-of-the-box SRW App case on Level EXTRN_MDL_SOURCE_BASEDIR_LBCS="/glade/p/ral/jntp/UFS_SRW_App/v2p0/input_model_data///" where: - * ```` refers to a subdirectory such as "FV3GFS" or "HRRR" containing the experiment data. + * ```` refers to a valid account name. + * ```` is an experiment name of the user's choice. + * ```` refers to a subdirectory, such as "FV3GFS" or "HRRR", containing the experiment data. * ```` refers to one of 3 possible data formats: ``grib2``, ``nemsio``, or ``netcdf``. - * ``YYYYMMDDHH`` refers to a subdirectory containing data for the :term:`cycle` date. + * ```` refers to a subdirectory containing data for the :term:`cycle` date (in YYYYMMDDHH format). **Hera, Jet, Orion, Gaea:** @@ -761,25 +752,25 @@ On Hera: .. code-block:: console - "/scratch2/BMC/det/UFS_SRW_App/v2p0/input_model_data///YYYYMMDDHH/" + "/scratch2/BMC/det/UFS_SRW_App/v2p0/input_model_data////" On Jet: .. code-block:: console - "/mnt/lfs4/BMC/wrfruc/UFS_SRW_App/v2p0/input_model_data///YYYYMMDDHH/" + "/mnt/lfs4/BMC/wrfruc/UFS_SRW_App/v2p0/input_model_data////" On Orion: .. code-block:: console - "/work/noaa/fv3-cam/UFS_SRW_App/v2p0/input_model_data///YYYYMMDDHH/" + "/work/noaa/fv3-cam/UFS_SRW_App/v2p0/input_model_data////" On Gaea: .. code-block:: console - "/lustre/f2/pdata/ncep/UFS_SRW_App/v2p0/input_model_data///YYYYMMDDHH/" + "/lustre/f2/pdata/ncep/UFS_SRW_App/v2p0/input_model_data////" On NOAA Cloud Systems: @@ -787,20 +778,20 @@ On NOAA Cloud Systems: MACHINE="NOAACLOUD" ACCOUNT="none" - EXPT_SUBDIR="" + EXPT_SUBDIR="" USE_USER_STAGED_EXTRN_FILES="TRUE" - EXTRN_MDL_SOURCE_BASEDIR_ICS="/contrib/EPIC/UFS_SRW_App/v2p0/input_model_data/FV3GFS/grib2/YYYYMMDDHH/" + EXTRN_MDL_SOURCE_BASEDIR_ICS="/contrib/EPIC/UFS_SRW_App/v2p0/input_model_data////" EXTRN_MDL_FILES_ICS=( "gfs.t18z.pgrb2.0p25.f000" ) - EXTRN_MDL_SOURCE_BASEDIR_LBCS="/contrib/EPIC/UFS_SRW_App/v2p0/input_model_data/FV3GFS/grib2/YYYYMMDDHH/" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="/contrib/EPIC/UFS_SRW_App/v2p0/input_model_data////" EXTRN_MDL_FILES_LBCS=( "gfs.t18z.pgrb2.0p25.f006" "gfs.t18z.pgrb2.0p25.f012" ) .. 
note:: The values of the configuration variables should be consistent with those in the - ``valid_param_vals script``. In addition, various example configuration files can be found in the ``regional_workflow/tests/baseline_configs`` directory. + ``valid_param_vals.sh`` script. In addition, various sample configuration files can be found in the ``regional_workflow/tests/baseline_configs`` directory. -To configure an experiment and python environment for a general Linux or Mac system, see the :ref:`next section `. Otherwise, skip to :numref:`Section %s `. +To configure an experiment and python environment for a general Linux or Mac system, see the :ref:`next section `. To configure an experiment to run METplus verification tasks, see :numref:`Section %s `. Otherwise, skip to :numref:`Section %s `. .. _LinuxMacEnvConfig: @@ -851,7 +842,6 @@ Users must create a virtual environment (``regional_workflow``), store it in the The virtual environment can be deactivated by running the ``deactivate`` command. The virtual environment built here will be reactivated in :numref:`Step %s ` and needs to be used to generate the workflow and run the experiment. - .. _LinuxMacExptConfig: Configuring an Experiment on General Linux and MacOS Systems @@ -860,12 +850,19 @@ Configuring an Experiment on General Linux and MacOS Systems **Optional: Install Rocoto** .. note:: - Users may `install Rocoto `__ if they want to make use of a workflow manager to run their experiments. However, this option has not been tested yet on MacOS and had limited testing on general Linux plaforms. + Users may `install Rocoto `__ if they want to make use of a workflow manager to run their experiments. However, this option has not been tested yet on MacOS and has had limited testing on general Linux plaforms. **Configure the SRW App:** -Configure an experiment using a template. Copy a ``config.community.sh`` file in the ``$SRW/regional_workflow/ush`` directory into ``config.sh`` file (see :numref:`Section %s `) above. In the ``config.sh`` file, set ``MACHINE="macos"`` or ``MACHINE="linux"``, and modify account and experiment info. For example: +Configure an experiment using a template. Copy the contents of ``config.community.sh`` into ``config.sh``: + +.. code-block:: console + + cd $SRW/regional_workflow/ush + cp config.community.sh config.sh + +In the ``config.sh`` file, set ``MACHINE="macos"`` or ``MACHINE="linux"``, and modify the account and experiment info. For example: .. code-block:: console @@ -880,7 +877,7 @@ Configure an experiment using a template. Copy a ``config.community.sh`` file in PREDEF_GRID_NAME="RRFS_CONUS_25km" QUILTING="TRUE" -Due to the limited number of processors on Mac OS systems, users must configure the domain decomposition defaults (usually, there are only 8 CPUs in M1-family chips and 4 CPUs for x86_64). +Due to the limited number of processors on MacOS systems, users must also configure the domain decomposition defaults (usually, there are only 8 CPUs in M1-family chips and 4 CPUs for x86_64). For :ref:`Option 1 `, add the following information to ``config.sh``: @@ -901,11 +898,11 @@ For :ref:`Option 2 `, add the following information to ``config.sh`` WRTCMP_write_tasks_per_group="1" .. note:: - The number of MPI processes required by the forecast will be equal to LAYOUT_X * LAYOUT_Y + WRTCMP_write_tasks_per_group. + The number of MPI processes required by the forecast will be equal to ``LAYOUT_X`` * ``LAYOUT_Y`` + ``WRTCMP_write_tasks_per_group``. 
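+   For example, with purely illustrative values of ``LAYOUT_X="2"``, ``LAYOUT_Y="2"``, and ``WRTCMP_write_tasks_per_group="1"``, the forecast would require 2 * 2 + 1 = 5 MPI processes, which fits within the 8 CPUs available on M1-family MacOS systems.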
**Configure the Machine File** -Configure a ``macos.sh`` or ``linux.sh`` machine file in ``$SRW/regional_workflow/ush/machine/`` based on the number of CPUs in the system (8 or 4 in MacOS), or a given number for Linux systems, ````. Job scheduler, ```` options are ``none``, ``slurm``, or another scheduler used by the system. +Configure a ``macos.sh`` or ``linux.sh`` machine file in ``$SRW/regional_workflow/ush/machine/`` based on the number of CPUs (````) in the system (usually 8 or 4 in MacOS; varies on Linux systems). Job scheduler (``SCHED``) options can be viewed :ref:`here `. Users must also set the path to the fix file directories. .. code-block:: console @@ -921,10 +918,10 @@ Configure a ``macos.sh`` or ``linux.sh`` machine file in ``$SRW/regional_workflo FIXgsm="path/to/FIXgsm/files" FIXaer="path/to/FIXaer/files" FIXlut="path/to/FIXlut/files" - TOPO_DIR="path/to/FIXgsm/files" # (path to location of static input files - used by the ``make_orog`` task) + TOPO_DIR="path/to/FIXgsm/files" # (path to location of static input files used by the + make_orog task) SFC_CLIMO_INPUT_DIR="path/to/FIXgsm/files" # (path to location of static surface climatology - input fields used by ``sfc_climo_gen``) + input fields used by sfc_climo_gen) # Run commands for executables RUN_CMD_SERIAL="time" @@ -941,7 +938,7 @@ Configure METplus Verification Suite (Optional) Users who want to use the METplus verification suite to evaluate their forecasts need to add additional information to their ``config.sh`` file. Other users may skip to the :ref:`next section `. .. attention:: - METplus *installation* is not included as part of the build process for this release of the SRW App. However, METplus is preinstalled on many `Level 1 & 2 `__ systems. For the v2 release, METplus *use* is supported on systems with a functioning METplus installation, although installation itself is not supported. For more information about METplus, see :numref:`Section %s `. + METplus *installation* is not included as part of the build process for this release of the SRW App. However, METplus is preinstalled on many `Level 1 & 2 `__ systems. For the v2.0.0 release, METplus *use* is supported on systems with a functioning METplus installation, although installation itself is not supported. For more information about METplus, see :numref:`Section %s `. .. note:: If METplus users update their METplus installation, they must update the module load statements in ``ufs-srweather-app/regional_workflow/modulefiles/tasks//run_vx.local`` file to correspond to their system's updated installation: @@ -969,7 +966,7 @@ Users who have already staged the observation data needed for METplus (i.e., the RUN_TASK_GET_OBS_MRMS="FALSE" RUN_TASK_GET_OBS_NDAS="FALSE" -If users have access to NOAA HPSS but have not pre-staged the data, they can simply set the ``RUN_TASK_GET_OBS_*`` tasks to "TRUE", and the machine will attempt to download the appropriate data from NOAA HPSS. The ``*_OBS_DIR`` paths must be set to the location where users want the downloaded data to reside. +If users have access to NOAA :term:`HPSS` but have not pre-staged the data, they can simply set the ``RUN_TASK_GET_OBS_*`` tasks to "TRUE", and the machine will attempt to download the appropriate data from NOAA HPSS. The ``*_OBS_DIR`` paths must be set to the location where users want the downloaded data to reside. 
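+
+As an illustration only, the observation paths might then be set in ``config.sh`` along the following lines (the ``CCPA_OBS_DIR``, ``MRMS_OBS_DIR``, and ``NDAS_OBS_DIR`` names and the paths shown are assumptions based on the ``*_OBS_DIR`` pattern described above; the authoritative variable names are listed in ``config_defaults.sh``):
+
+.. code-block:: console
+
+   CCPA_OBS_DIR="/path/to/obs_data/ccpa"
+   MRMS_OBS_DIR="/path/to/obs_data/mrms"
+   NDAS_OBS_DIR="/path/to/obs_data/ndas"
+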
Users who do not have access to NOAA HPSS and do not have the data on their system will need to download :term:`CCPA`, :term:`MRMS`, and :term:`NDAS` data manually from collections of publicly available data, such as the ones listed `here `__. @@ -982,14 +979,14 @@ Next, the verification tasks must be turned on according to the user's needs. Us RUN_TASK_VX_ENSGRID="TRUE" RUN_TASK_VX_ENSPOINT="TRUE" -These tasks are independent, so users may set some values to "TRUE" and others to "FALSE" depending on the needs of their experiment. Note that the ENSGRID and ENSPOINT tasks apply only to ensemble model verification. Additional verification tasks appear in :numref:`Table %s ` More details on all of the parameters in this section are available in :numref:`Chapter %s `. +These tasks are independent, so users may set some values to "TRUE" and others to "FALSE" depending on the needs of their experiment. Note that the ENSGRID and ENSPOINT tasks apply only to ensemble model verification. Additional verification tasks appear in :numref:`Table %s `. More details on all of the parameters in this section are available in :numref:`Section %s `. .. _SetUpPythonEnv: Set Up the Python and Other Environment Parameters ---------------------------------------------------- -The workflow requires Python 3 with the packages ``PyYAML``, ``Jinja2``, and ``f90nml`` available. This Python environment has already been set up on Level 1 platforms, and it can be activated in the following way (from ``/ufs-srweather-app/regional_workflow/ush``): +The workflow requires Python 3 with the packages ``PyYAML``, ``Jinja2``, and ``f90nml`` available. This Python environment has already been set up on Level 1 platforms, and it can be activated in the following way: .. code-block:: console @@ -1011,21 +1008,27 @@ then the user should run ``conda activate regional_workflow``. This will activat source ~/.bashrc conda activate regional_workflow - .. _LinuxMacActivateWFenv: -Activate the Workflow Environment on General MacOS/Linux Systems -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ +Activating the Workflow Environment on Non-Level 1 Systems +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -The ``regional_workflow`` environment can be activated as follows for ``="macos"``, or ``"=linux"``: +Users on non-Level 1 systems can copy one of the provided ``wflow_`` files and use it as a template to create a ``wflow_`` file that works for their system. ``wflow_macos`` and ``wflow_linux`` template files are provided with the release. After making appropriate modifications to a ``wflow_`` file, users can run the commands from :numref:`Step %s ` above to activate the regional workflow. + +On generic Linux or MacOS systems, loading the designated ``wflow_`` file will output instructions similar to the following: .. code-block:: console - cd $SRW/regional_workflow/ush - module load wflow_ + Please do the following to activate conda: + > source $VENV/bin/activate + +If that does not work, users can also try: + +.. code-block:: console -This should activate the ``regional_workflow`` environment created in :numref:`Step %s `. From here, the user may continue to :numref:`Section %s ` to generate the regional workflow. + source $HOME/venv/regional_workflow/bin/activate +However, it may instead be necessary to make additional adjustments to the ``wflow_`` file. .. 
_GenerateWorkflow: @@ -1038,15 +1041,13 @@ Run the following command from the ``ufs-srweather-app/regional_workflow/ush`` d ./generate_FV3LAM_wflow.sh -The last line of output from this script, starting with ``*/1 * * * *`` or ``*/3 * * * *``, can be saved and :ref:`used later ` to automatically run portions of the workflow. - -When not using Rocoto on Linux or MacOS systems, the experiment could be launched using stand-alone scripts as :numref:`Section %s `. +The last line of output from this script, starting with ``*/1 * * * *`` or ``*/3 * * * *``, can be saved and :ref:`used later ` to automatically run portions of the workflow if users have the Rocoto workflow manager installed on their system. -This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The flowchart in :numref:`Figure %s ` describes the experiment generation process. First, ``generate_FV3LAM_wflow.sh`` runs the ``setup.sh`` script to set the configuration parameters. Second, it copies the time-independent (fix) files and other necessary data input files from their location in the ufs-weather-model directory to the experiment directory (``EXPTDIR``). Third, it copies the weather model executable (``ufs_model``) from the ``bin`` directory to ``EXPTDIR`` and creates the input namelist file ``input.nml`` based on the ``input.nml.FV3`` file in the regional_workflow/ush/templates directory. Lastly, it creates the workflow XML file ``FV3LAM_wflow.xml`` that is executed when running the experiment with the Rocoto workflow manager. +This workflow generation script creates an experiment directory and populates it with all the data needed to run through the workflow. The flowchart in :numref:`Figure %s ` describes the experiment generation process. First, ``generate_FV3LAM_wflow.sh`` runs the ``setup.sh`` script to set the configuration parameters. Second, it copies the time-independent (fix) files and other necessary data input files from their location in the ufs-weather-model directory to the experiment directory (``$EXPTDIR``). Third, it copies the weather model executable (``ufs_model``) from the ``bin`` directory to ``$EXPTDIR`` and creates the input namelist file ``input.nml`` based on the ``input.nml.FV3`` file in the regional_workflow/ush/templates directory. Lastly, it creates the workflow XML file ``FV3LAM_wflow.xml`` that is executed when running the experiment with the Rocoto workflow manager. -The ``setup.sh`` script reads three other configuration scripts in order: (1) ``config_default.sh`` (:numref:`Section %s `), (2) ``config.sh`` (:numref:`Section %s `), and (3) ``set_predef_grid_params.sh`` (:numref:`Section %s `). If a parameter is specified differently in these scripts, the file containing the last defined value will be used. +The ``setup.sh`` script reads three other configuration scripts in order: (1) ``config_default.sh`` (:numref:`Section %s `), (2) ``config.sh`` (:numref:`Section %s `), and (3) ``set_predef_grid_params.sh``. If a parameter is specified differently in these scripts, the file containing the last defined value will be used. -The generated workflow will appear in ``EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. These variables were specified in the ``config.sh`` file in :numref:`Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in ``$EXPTDIR``. 
+The generated workflow will appear in ``$EXPTDIR``, where ``EXPTDIR=${EXPT_BASEDIR}/${EXPT_SUBDIR}``. These variables were specified in the ``config.sh`` file in :numref:`Step %s `. The settings for these paths can also be viewed in the console output from the ``./generate_FV3LAM_wflow.sh`` script or in the ``log.generate_FV3LAM_wflow`` file, which can be found in ``$EXPTDIR``. .. _WorkflowGeneration: @@ -1104,11 +1105,11 @@ The ``FV3LAM_wflow.xml`` file runs the specific j-job scripts (``regional_workfl | | initial conditions | +----------------------+------------------------------------------------------------+ | get_extrn_lbcs | Cycle-specific task to obtain external data for the | - | | lateral boundary conditions (LBC's) | + | | lateral boundary conditions (LBCs) | +----------------------+------------------------------------------------------------+ | make_ics | Generate initial conditions from the external data | +----------------------+------------------------------------------------------------+ - | make_lbcs | Generate LBC's from the external data | + | make_lbcs | Generate LBCs from the external data | +----------------------+------------------------------------------------------------+ | run_fcst | Run the forecast model (UFS weather model) | +----------------------+------------------------------------------------------------+ @@ -1126,7 +1127,7 @@ In addition to the baseline tasks described in :numref:`Table %s `__ systems, such as MacOS), they should follow the process outlined in :numref:`Section %s ` instead of the instructions in this section. + If users are running the SRW App in a container or on a system that does not have Rocoto installed (e.g., `Level 3 & 4 `__ systems, such as MacOS or generic Linux systems), they should follow the process outlined in :numref:`Section %s ` instead of the instructions in this section. + +The information in this section assumes that Rocoto is available on the desired platform. All official HPC platforms for the UFS SRW App release make use of the Rocoto workflow management software for running experiments. However, Rocoto cannot be used when running the workflow within a container. If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts according to the process outlined in :numref:`Section %s `. -The information in this section assumes that Rocoto is available on the desired platform. All official HPC platforms for the UFS SRW App release make use of the Rocoto workflow management software for running experiments. However, Rocoto cannot be used when running the workflow within a container. If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts according to the process outlined in :numref:`Section %s `. There are two main ways to run the workflow with Rocoto: (1) with the ``launch_FV3LAM_wflow.sh`` script, and (2) by manually calling the ``rocotorun`` command. Users can also automate the workflow using a crontab. +There are two main ways to run the workflow with Rocoto: (1) with the ``launch_FV3LAM_wflow.sh`` script, and (2) by manually calling the ``rocotorun`` command. Users can also automate the workflow using a crontab. .. note:: Users may find it helpful to review :numref:`Chapter %s ` to gain a better understanding of Rocoto commands and workflow management before continuing, but this is not required to run the experiment. 
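+
+As a minimal illustration of the second approach, the workflow can be advanced by hand from the experiment directory and its progress checked with ``rocotostat``; the ``-w``, ``-d``, and ``-v`` flags point to the workflow XML file, the workflow database file, and a verbosity level, respectively. The location of the Rocoto executables (and whether a module must be loaded first) varies by platform:
+
+.. code-block:: console
+
+   cd $EXPTDIR
+   rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
+   rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
+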
@@ -1252,7 +1255,7 @@ To run Rocoto using the ``launch_FV3LAM_wflow.sh`` script provided, simply call cd $EXPTDIR ./launch_FV3LAM_wflow.sh -This script creates a log file named ``log.launch_FV3LAM_wflow`` in ``$EXPTDIR`` or appends information to it if the file already exists. The launch script also creates the ``log/FV3LAM_wflow.log`` file, which shows Rocoto task information. Check the end of the log files periodically to see how the experiment is progressing: +This script creates a log file named ``log.launch_FV3LAM_wflow`` in ``$EXPTDIR`` or appends information to the file if it already exists. The launch script also creates the ``log/FV3LAM_wflow.log`` file, which shows Rocoto task information. Check the end of the log file periodically to see how the experiment is progressing: .. code-block:: console @@ -1313,9 +1316,9 @@ The workflow run is complete when all tasks have "SUCCEEDED". If everything goes 201906150000 run_post_f000 4953244 SUCCEEDED 0 1 5.0 201906150000 run_post_f001 4953245 SUCCEEDED 0 1 4.0 ... - 201906150000 run_post_f048 4953381 SUCCEEDED 0 1 7.0 + 201906150000 run_post_f012 4953381 SUCCEEDED 0 1 7.0 -If users choose to run METplus verification tasks as part of their experiment, the output above will include additional lines after ``run_post_f048``. The output will resemble the following but may be significantly longer when using ensemble verification: +If users choose to run METplus verification tasks as part of their experiment, the output above will include additional lines after ``run_post_f012``. The output will resemble the following but may be significantly longer when using ensemble verification: .. code-block:: console @@ -1323,7 +1326,7 @@ If users choose to run METplus verification tasks as part of their experiment, t ========================================================================================================== 201906150000 make_grid 30466134 SUCCEEDED 0 1 5.0 ... - 201906150000 run_post_f048 30468271 SUCCEEDED 0 1 7.0 + 201906150000 run_post_f012 30468271 SUCCEEDED 0 1 7.0 201906150000 run_gridstatvx 30468420 SUCCEEDED 0 1 53.0 201906150000 run_gridstatvx_refc 30468421 SUCCEEDED 0 1 934.0 201906150000 run_gridstatvx_retop 30468422 SUCCEEDED 0 1 1002.0 @@ -1378,7 +1381,7 @@ If the experiment fails, the ``rocotostat`` command will indicate which task fai Automated Option ---------------------- -For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add the following commands to their ``config.sh`` file: +For automatic resubmission of the workflow at regular intervals (e.g., every minute), the user can add the following commands to their ``config.sh`` file *before* generating the experiment: .. code-block:: console @@ -1390,14 +1393,14 @@ Alternatively, the user can add a crontab entry using the ``crontab -e`` command .. code-block:: console - */3 * * * * cd && /apps/rocoto/1.3.3/bin/rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10 + */3 * * * * cd && ./launch_FV3LAM_wflow.sh called_from_cron="TRUE" -where ```` is changed to correspond to the user's ``$EXPTDIR``, and ``/apps/rocoto/1.3.3/bin/rocotorun`` corresponds to the location of the ``rocotorun`` command on the user's system. The number ``3`` can be changed to a different positive integer and simply means that the workflow will be resubmitted every three minutes. +where ```` is changed to correspond to the user's ``$EXPTDIR``. 
The number ``3`` can be changed to a different positive integer and simply means that the workflow will be resubmitted every three minutes. .. hint:: * On NOAA Cloud instances, ``*/1 * * * *`` is the preferred option for cron jobs because compute nodes will shut down if they remain idle too long. If the compute node shuts down, it can take 15-20 minutes to start up a new one. - * On other NOAA HPC systems, admins discourage the ``*/1 * * * *`` due to load problems. ``*/3 * * * *`` is the preferred option for cron jobs on non-Cloud systems. + * On other NOAA HPC systems, admins discourage the ``*/1 * * * *`` due to load problems. ``*/3 * * * *`` is the preferred option for cron jobs on non-NOAA Cloud systems. To check the experiment progress: diff --git a/docs/UsersGuide/source/Components.rst b/docs/UsersGuide/source/Components.rst index cbf9a5d28a..17ea7db262 100644 --- a/docs/UsersGuide/source/Components.rst +++ b/docs/UsersGuide/source/Components.rst @@ -19,10 +19,7 @@ These components are documented within this User's Guide and supported through a Pre-processor Utilities and Initial Conditions ============================================== -The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Since the SRW App provides forecast predictions over a limited area (rather than globally), it is necessary to first generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of :term:`halo` cells, additional utilities handle topography filtering and shave the number of halo points (based on downstream workflow component requirements). The pre-processing software :term:`chgres_cube` is used to convert the raw external model data into initial and lateral boundary condition files in :term:`netCDF` format. These are needed as input to the :term:`FV3`-:term:`LAM`. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS User's Guide `__. - -.. - COMMENT: Update link! +The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Since the SRW App provides forecast predictions over a limited area (rather than globally), it is necessary to first generate a regional grid (``regional_esg_grid/make_hgrid``) along with :term:`orography` (``orog``) and surface climatology (``sfc_climo_gen``) files on that grid. Grids include a strip, or "halo," of six cells that surround the regional grid and feed in lateral boundary condition data. Since different grid and orography files require different numbers of :term:`halo` cells, additional utilities handle topography filtering and shave the number of halo points (based on downstream workflow component requirements). The pre-processing software :term:`chgres_cube` is used to convert the raw external model data into initial and lateral boundary condition files in :term:`netCDF` format. These are needed as input to the :term:`FV3`-:term:`LAM`. Additional information about the UFS pre-processor utilities can be found in the `UFS_UTILS Technical Documentation `__ and in the `UFS_UTILS Scientific Documentation `__. The SRW Application can be initialized from a range of operational initial condition files. 
It is possible to initialize the model from the Global Forecast System (:term:`GFS`), North American Mesoscale (:term:`NAM`) Forecast System, Rapid Refresh (:term:`RAP`), and High-Resolution Rapid Refresh (:term:`HRRR`) files in Gridded Binary v2 (:term:`GRIB2`) format. GFS files also come in :term:`NEMSIO` format for past dates. diff --git a/docs/UsersGuide/source/ConfigWorkflow.rst b/docs/UsersGuide/source/ConfigWorkflow.rst index 127435b82b..c67c799308 100644 --- a/docs/UsersGuide/source/ConfigWorkflow.rst +++ b/docs/UsersGuide/source/ConfigWorkflow.rst @@ -48,6 +48,8 @@ Platform Environment ``WFLOW_MOD_FN``: (Default: "") Name of alternative workflow module file to use if running on an unsupported platform. Is set automatically for supported machines. +.. _sched: + ``SCHED``: (Default: "") The job scheduler to use (e.g., Slurm) on the specified ``MACHINE``. Leaving this an empty string allows the experiment generation script to set it automatically depending on the machine the workflow is running on. Valid values: ``"slurm"`` | ``"pbspro"`` | ``"lsf"`` | ``"lsfcray"`` | ``"none"`` @@ -397,7 +399,7 @@ CCPP Parameter | ``"FV3_GFS_v16"`` | ``"FV3_RRFS_v1beta"`` | ``"FV3_HRRR"`` - | ``"FV3_WoFS"`` + | ``"FV3_WoFS_v0"`` **Other valid values include:** diff --git a/docs/UsersGuide/source/InputOutputFiles.rst b/docs/UsersGuide/source/InputOutputFiles.rst index 348142904e..32b52bbb9b 100644 --- a/docs/UsersGuide/source/InputOutputFiles.rst +++ b/docs/UsersGuide/source/InputOutputFiles.rst @@ -27,10 +27,7 @@ The data format for these files can be :term:`GRIB2` or :term:`NEMSIO`. More inf Pre-processing (UFS_UTILS) --------------------------- -When a user generates the regional workflow, as described in :numref:`Section %s `, the workflow generation script links the input data for the pre-processing utilities to the experiment directory. The pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found in the `UFS_UTILS Documentation `__. - -.. - COMMENT: Update link! (UFS_UTILS) +When a user generates the regional workflow, as described in :numref:`Section %s `, the workflow generation script links the input data for the pre-processing utilities to the experiment directory. The pre-processing utilities use many different datasets to create grids and to generate model input datasets from the external model files. A detailed description of the input files for the pre-processing utilities can be found in the UFS_UTILS `Technical Documentation `__ and `Scientific Documentation `__. UFS Weather Model ----------------- @@ -100,10 +97,7 @@ and are shown in :numref:`Table %s `. | README.xml_templating.md | Instructions for Rocoto XML templating with Jinja. | +-----------------------------+--------------------------------------------------------------+ -Additional information related to ``diag_table_[CCPP]``, ``field_table_[CCPP]``, ``input.nml.FV3``, ``model_conigure``, and ``nems.configure`` can be found in the `UFS Weather Model User's Guide `__, while information on ``regional_grid.nml`` can be found in the `UFS_UTILS User's Guide `__. - -.. - COMMENT: Update links! 
(for UFS_UTILS) +Additional information related to ``diag_table_[CCPP]``, ``field_table_[CCPP]``, ``input.nml.FV3``, ``model_conigure``, and ``nems.configure`` can be found in the `UFS Weather Model User's Guide `__, while information on ``regional_grid.nml`` options can be found in the `UFS_UTILS Technical Documentation `__. Migratory Route of the Input Files in the Workflow ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ diff --git a/docs/UsersGuide/source/Introduction.rst b/docs/UsersGuide/source/Introduction.rst index f21db91249..a03ff219a2 100644 --- a/docs/UsersGuide/source/Introduction.rst +++ b/docs/UsersGuide/source/Introduction.rst @@ -8,11 +8,11 @@ The Unified Forecast System (:term:`UFS`) is a community-based, coupled, compreh The UFS includes `multiple applications `__ that support different forecast durations and spatial domains. This documentation describes the UFS Short-Range Weather (SRW) Application, which targets predictions of atmospheric behavior on a limited spatial domain and on time scales from minutes to several days. The SRW Application v2.0.0 release includes a prognostic atmospheric model, pre- and post-processing, and a community workflow for running the system end-to-end. These components are documented within this User's Guide and supported through a `community forum `_. New and improved capabilities for this release include the addition of a verification package (METplus) for both deterministic and ensemble simulations and support for four stochastically perturbed perturbation schemes. Future work will expand the capabilities of the application to include data assimilation (DA) and a forecast restart/cycling capability. -This documentation provides a :ref:`Quick Start Guide ` for running the SRW Application in a container and a :ref:`detailed guide ` for running the SRW App on supported platforms. It also provides an overview of the :ref:`release components ` and details on how to customize or modify different portions of the workflow. +This documentation provides a :ref:`Quick Start Guide ` designed for use on `Level 1 systems `__ or as an overview of the workflow. It also provides a :ref:`Container-Based Quick Start Guide ` for running the SRW Application in a container and a :ref:`detailed guide ` for running the SRW App on any supported platform. Additionally, this User's Guide provides an overview of the :ref:`release components ` and details on how to customize or modify different portions of the workflow. The SRW App v2.0.0 citation is as follows and should be used when presenting results based on research conducted with the App: -UFS Development Team. (2022, June 17). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v2.0.0). Zenodo. https://doi.org/10.5281/zenodo.6505854 +UFS Development Team. (2022, June 22). Unified Forecast System (UFS) Short-Range Weather (SRW) Application (Version v2.0.0). Zenodo. https://doi.org/10.5281/zenodo.6505854 How to Use This Document @@ -30,7 +30,7 @@ Variables presented as ``AaBbCc123`` in this User's Guide typically refer to var File paths or code that include angle brackets (e.g., ``build__``) indicate that users should insert options appropriate to their SRW App configuration (e.g., ``build_orion_intel``). .. hint:: - * To get started running the SRW App, see the :ref:`Quick Start Guide ` for beginners or refer to the in-depth chapter on :ref:`Running the Short-Range Weather Application `. 
+ * To get started running the SRW App, users can view :numref:`Chapter %s ` for a quick overview of the workflow steps. For more detailed explanations, users can refer to the :ref:`Container-Based Quick Start Guide ` or the in-depth chapter on :ref:`Building and Running the Short-Range Weather Application `. * For background information on the SRW App code repositories and directory structure, see :numref:`Section %s ` below. * For an outline of SRW App components, see section :numref:`Section %s ` below or refer to :numref:`Chapter %s ` for a more in-depth treatment. @@ -73,11 +73,8 @@ The UFS SRW Application has been designed so that any sufficiently up-to-date ma * 53 GB input data for a standard collection of global database, or "fix" data (topography, climatology, observational database) for a short 12-hour test forecast on CONUS 25km domain. See data download instructions in :numref:`Section %s `. * 8 GB for :term:`HPC-Stack` full installation * 3 GB for ``ufs-srweather-app`` installation - * 1 GB boundary conditions for a short 12-h test forecast on CONUS 25km domain. See data download instructions in :numref:`Section %s ` - * 17 GB for a 12-h test forecast on CONUS 25km domain, with model output saved hourly, see :numref:`Section %s ` - - -* 4GB memory (CONUS 25km domain) + * 1 GB for boundary conditions for a short 12-h test forecast on the CONUS 25km domain. See data download instructions in :numref:`Section %s ` + * 17 GB for a 12-h test forecast on the CONUS 25km domain, with model output saved hourly, see :numref:`Section %s ` * Fortran compiler released since 2018 @@ -113,7 +110,7 @@ The following software is also required to run the SRW Application, but the :ter For MacOS systems, some additional software packages are needed. When possible, it is recommended that users install and/or upgrade this software (along with software listed above) using the `Homebrew `__ package manager for MacOS. See :numref:`Chapter %s ` and :numref:`Chapter %s ` for further guidance on installing these prerequisites on MacOS. * bash v4.x -* `gcc@11` compiler package +* GNU compiler suite v.11 or higher with gfortran * cmake * make * coreutils @@ -125,9 +122,6 @@ Optional but recommended prerequisites for all systems: * Rocoto Workflow Management System (1.3.1) * Python packages ``scipy``, ``matplotlib``, ``pygrib``, ``cartopy``, and ``pillow`` for graphics -.. - COMMENT: Lmod is listed as required - .. _ComponentsOverview: SRW App Components Overview @@ -136,10 +130,7 @@ SRW App Components Overview Pre-processor Utilities and Initial Conditions ------------------------------------------------ -The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Tasks include generating a regional grid along with :term:`orography` and surface climatology files for that grid. One pre-processing utility converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, these files are used as input to the atmospheric model (FV3-LAM). Additional information about the pre-processor utilities can be found in :numref:`Chapter %s ` and in the `UFS_UTILS User’s Guide `__. - -.. - COMMENT: Update link! +The SRW Application includes a number of pre-processing utilities that initialize and prepare the model. Tasks include generating a regional grid along with :term:`orography` and surface climatology files for that grid. 
One pre-processing utility converts the raw external model data into initial and lateral boundary condition files in netCDF format. Later, these files are used as input to the atmospheric model (FV3-LAM). Additional information about the pre-processor utilities can be found in :numref:`Chapter %s `, in the `UFS_UTILS Technical Documentation `__, and in the `UFS_UTILS Scientific Documentation `__. Forecast Model ----------------- @@ -307,7 +298,7 @@ A number of sub-directories are created under the ``regional_workflow`` director Experiment Directory Structure -------------------------------- -When the user generates an experiment using the ``generate_FV3LAM_wflow.sh`` script (:numref:`Section %s `), a user-defined experimental directory (``EXPTDIR``) is created based on information specified in the ``config.sh`` file. :numref:`Table %s ` shows the contents of the experiment directory before running the experiment workflow. +When the user generates an experiment using the ``generate_FV3LAM_wflow.sh`` script (:numref:`Section %s `), a user-defined experimental directory (``$EXPTDIR``) is created based on information specified in the ``config.sh`` file. :numref:`Table %s ` shows the contents of the experiment directory before running the experiment workflow. .. _ExptDirStructure: @@ -435,8 +426,11 @@ A list of available documentation is shown in :numref:`Table %s `. Parameters and valid values are listed in :numref:`Chapter %s `. - #. Load the python environment for the regional workflow. Users on Level 3-4 systems will need to use one of the existing ``wflow_`` modulefiles (e.g., ``wflow_macos``) and adapt it to their system. + #. Load the python environment for the regional workflow. Users on Level 2-4 systems will need to use one of the existing ``wflow_`` modulefiles (e.g., ``wflow_macos``) and adapt it to their system. .. code-block:: console module use module load wflow_ - conda activate regional_workflow + + After loading the workflow, users should follow the instructions printed to the console. For example, if the output says: + + .. code-block:: console + + Please do the following to activate conda: + > conda activate regional_workflow + + then the user should run ``conda activate regional_workflow`` to activate the ``regional_workflow`` environment. #. Generate the experiment workflow. diff --git a/docs/UsersGuide/source/Quickstart.rst b/docs/UsersGuide/source/Quickstart.rst index 942df95e2c..c7d91e3253 100644 --- a/docs/UsersGuide/source/Quickstart.rst +++ b/docs/UsersGuide/source/Quickstart.rst @@ -24,15 +24,22 @@ Prerequisites: Install Singularity To build and run the SRW App using a Singularity container, first install the Singularity package according to the `Singularity Installation Guide `__. This will include the installation of dependencies and the installation of the Go programming language. SingularityCE Version 3.7 or above is recommended. .. warning:: - Docker containers can only be run with root privileges, and users cannot have root privileges on :term:`HPCs`. Therefore, it is not possible to build the SRW, which uses the HPC-Stack, inside a Docker container on an HPC system. A Docker image may be pulled, but it must be run inside a container such as Singularity. + Docker containers can only be run with root privileges, and users cannot have root privileges on :term:`HPCs`. Therefore, it is not possible to build the SRW, which uses the HPC-Stack, inside a Docker container on an HPC system. 
However, a Singularity image may be built directly from a Docker image for use on the system. -Working in the Cloud ------------------------ +Working in the Cloud or on HPC Systems +----------------------------------------- -For those working on non-cloud-based systems, skip to :numref:`Step %s `. Users building the SRW App using NOAA's Cloud resources must complete a few additional steps to ensure that the SRW App builds and runs correctly. +For users working on systems with limited disk space in their ``/home`` directory, it is recommended to set the ``SINGULARITY_CACHEDIR`` and ``SINGULARITY_TEMPDIR`` environment variables to point to a location with adequate disk space. For example: -On NOAA Cloud systems, certain environment variables must be set *before* building the container: +.. code-block:: + + export SINGULARITY_CACHEDIR= + export SINGULARITY_TEMPDIR= + +where ``/absolute/path/to/writable/directory/`` refers to a writable directory (usually a project or user directory within ``/lustre``, ``/work``, ``/scratch2``, or ``/glade`` on NOAA Level 1 systems). + +On NOAA Cloud systems, the ``sudo su`` command may also be required: .. code-block:: @@ -46,9 +53,10 @@ If the ``cache`` and ``tmp`` directories do not exist already, they must be crea .. note:: ``/lustre`` is a fast but non-persistent file system used on NOAA Cloud systems. To retain work completed in this directory, `tar the files `__ and move them to the ``/contrib`` directory, which is much slower but persistent. + .. _BuildC: -Set Up the Container +Build the Container ------------------------ Build the container: @@ -60,6 +68,51 @@ Build the container: .. hint:: If a ``singularity: command not found`` error message appears, try running: ``module load singularity``. +.. _WorkOnHPC: + +Allocate a Compute Node +-------------------------- + +Those *not* working on HPC systems may skip to the :ref:`next step `. +On HPC systems (including NOAA's Cloud platforms), allocate a compute node on which to run the SRW App. On NOAA's Cloud platforms, the following commands will allocate a compute node: + +.. code-block:: console + + salloc -N 1 + module load gnu openmpi + mpirun -n 1 hostname + ssh + +The third command will output a hostname. Replace ```` in the last command with the output from the third command. After "ssh-ing" to the compute node in the last command, build and run the SRW App from that node. + +The appropriate commands on other Level 1 platforms will vary, and users should consult the `documentation `__ for those platforms. In general, the allocation command will follow one of these two patterns depending on whether the system uses the Slurm or PBS resource manager respectively: + +.. code-block:: console + + salloc -N 1 -n -A -t //" EXTRN_MDL_FILES_ICS=( "gfs.t18z.pgrb2.0p25.f000" ) - EXTRN_MDL_SOURCE_BASEDIR_LBCS="" + EXTRN_MDL_SOURCE_BASEDIR_LBCS="//" EXTRN_MDL_FILES_LBCS=( "gfs.t18z.pgrb2.0p25.f006" "gfs.t18z.pgrb2.0p25.f012") On Level 1 systems, ``/path/to/input_model_data/FV3GFS`` should correspond to the location of the machine's global data, which can be viewed :ref:`here ` for Level 1 systems. Alternatively, the user can add the path to their local data if they downloaded it as described in :numref:`Section %s `. @@ -154,19 +202,20 @@ On NOAA Cloud platforms, users may continue to the :ref:`next step ` after changing these settings. Detailed guidance on the variables in the code fragment above can be found in :numref:`Chapter %s: Configuring the Workflow `. 
@@ -198,48 +247,6 @@ Next, activate the regional workflow: The user should see ``(regional_workflow)`` in front of the Terminal prompt at this point. -.. _WorkOnHPC: - -Allocate a Compute Node --------------------------- - -Those *not* working on HPC systems may skip to the :ref:`next step `. -On HPC systems (including NOAA's Cloud platforms), allocate a compute node on which to run the SRW App. On NOAA's Cloud platforms, the following commands will allocate a compute node: - -.. code-block:: console - - salloc -N 1 - module load gnu openmpi - mpirun -n 1 hostname - ssh - -The third command will output a hostname. Replace ```` in the last command with the output from the third command. After "ssh-ing" to the compute node in the last command, build and run the SRW App from that node. - -The appropriate commands on other Level 1 platforms will vary, and users should consult the `documentation `__ for those platforms. In general, the allocation command will follow one of these two patterns depending on whether the system uses the Slurm or PBS resource manager respectively: - -.. code-block:: console - - salloc -N 1 -n -A -t