Update SRW Documentation #212

Merged
merged 80 commits into ufs-community:develop on Mar 30, 2022

Conversation

@gspetro-NOAA (Collaborator) commented Feb 15, 2022

DESCRIPTION OF CHANGES:

Updates to the SRW Documentation including:

  • Added submodule w/ HPC-stack install docs
  • Documented container vs. non-container approaches
  • New containerized Quickstart doc
  • Build/Run SRW doc (former Quickstart) expanded for a detailed walkthrough of SRW.
  • Merged SRWAppOverview doc into the Build/Run SRW doc.
  • Merged CodeReposAndDirs into Introduction
  • Summarized components section in Intro and created detailed Components section for more thorough information on specific SRW components.
  • Tips for non-Level 1/2 users
  • Edits for readability/concision/newbie-friendliness (esp. to intro)

TESTS CONDUCTED:

None required.
Ran through the workflows on Cloud and Orion.

DEPENDENCIES:

DOCUMENTATION:

All the edits are docs. ;)

ISSUE:

Partially fixes issue mentioned in #211

CONTRIBUTORS:

@gspetro-NOAA @NOAA-EPIC @jkbk2004 @EdwardSnyder-NOAA @trihug @mark-a-potts @laurenfrederick

@gspetro-NOAA marked this pull request as ready for review February 16, 2022 00:52

Throughout the guide, this presentation style indicates shell
commands and options, code examples, etc.
.. bibliography:: references.bib
Collaborator

We will be supporting the capability to create a user-defined domain in the v2 release, although I'm not sure I would call it a "fully supported capability" quite yet; there are still a lot of manual changes a user needs to make. DA and cycling are coming, so that's good. The ensemble capability already exists. Verification is now available and will be supported for both deterministic and ensemble simulations. We previously included tendency-based stochastic physics (SPPT, SHUM, and SKEB) in the v1 release; Stochastically Perturbed Parameterizations (SPP) will now be included as well, and all four schemes will be supported, whereas none were supported in v1.

Collaborator

You could also add that user-defined vertical levels (number and distribution) will be supported in an upcoming release.

Collaborator Author

Made these changes. Are there other specific improvements planned for the future? I left the grid bullet point in, since I don't think we have expanded the number of predefined grids, and, as you say, the user-defined capability isn't fully supported, even though it was an option in the last release and has expanded support in this one. I can remove it, though, if you prefer.

Collaborator

I think it's good to leave the user-defined grid capability listed. If I think of anything else, I'll let you know!

Collaborator

Ah, @JeffBeck-NOAA, interesting to hear that you don't think the user-defined domains should be considered fully supported yet. I commented on that a few times in the documentation, thinking maybe they would be. Is anyone going to be working on making the capability more user-friendly? That is different from being fully supported, though.

@JeffBeck-NOAA (Collaborator) Mar 28, 2022

The reason I wasn't sure about "fully supported" for the user-defined domain capability is that it's the same as it was in version 1.0.0. In other words, the user still needs to make a significant number of manual modifications to create their own domain in the set_predef_grid_params.sh script. If EPIC is confident they can help users with this feature, then it could be considered fully supported. At some point, it would be preferable to introduce config.sh-based options that a user could leverage to introduce a new predefined domain, which would be much more user-friendly.

@jwolff-ncar (Collaborator) left a comment

Initial review of the build/run chapter.

ACCOUNT="<my_account>"
EXPT_SUBDIR="<my_expt_name>"
USE_USER_STAGED_EXTRN_FILES="TRUE"
EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_app/model_data/FV3GFS"
Collaborator

This path is wrong and should be:
/glade/p/ral/jntp/UFS_SRW_app/develop/staged_extrn_mdl_files
or if that is copied to v2p0 to remain static for this release:
/glade/p/ral/jntp/UFS_SRW_app/v2p0/staged_extrn_mdl_files

All of these paths should be checked against what is defined in the regional_workflow code base and I am tagging @mkavulich here to help comment on this.

Collaborator Author

I switched it to "/glade/p/ral/jntp/UFS_SRW_app/develop/staged_extrn_mdl_files" for now.
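
For reference, a minimal sketch of how the corrected entry might look in the Cheyenne config.sh excerpt quoted above. Only the base path comes from this thread; whether a model-specific subdirectory or the matching LBCS variable is also needed depends on the staged data layout, so treat those as assumptions:

USE_USER_STAGED_EXTRN_FILES="TRUE"
EXTRN_MDL_SOURCE_BASEDIR_ICS="/glade/p/ral/jntp/UFS_SRW_app/develop/staged_extrn_mdl_files"
# EXTRN_MDL_SOURCE_BASEDIR_LBCS would typically point at the same staged tree (assumption)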

Collaborator

Great. Mike opened a PR that should be followed here: #231

MACHINE="hera"
ACCOUNT="<my_account>"
EXPT_SUBDIR="<my_expt_name>"

Collaborator

Do you want to add a path to the staged data for these machines as well?

Collaborator Author

Probably a good idea, although it was only listed for certain Level 1 systems in the original docs. Where do I find the paths?

Collaborator

It is on the wiki: https://github.com/ufs-community/ufs-srweather-app/wiki/Getting-Started#pre-staged-raw-initial-conditions

We will want to make sure the wiki documentation is updated accordingly too.

Collaborator Author

Another question: when would we set USE_USER_STAGED_EXTRN_FILES="FALSE"? The docs say that if people plan to use external data they have downloaded and staged themselves, they need to set this variable to "TRUE", but it looks like it should be "TRUE" even when using pre-staged/pre-existing data on Level 1 machines. I was originally under the impression that we didn't have to set the paths on Level 1 machines, but I have had to do so on all of them...

@JeffBeck-NOAA (Collaborator) Mar 28, 2022

USE_USER_STAGED_EXTRN_FILES should be set to "TRUE" when the workflow needs to use staged data, whether provided by the user or staged by others (e.g., the WE2E external model data). The only time this flag can be set to "FALSE" is when a user is attempting to fetch external model data from HPSS (from a connected NOAA HPC machine) or from one of the online data archives (AWS, Google, etc.; this functionality is still a work in progress).
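
To illustrate the two cases described above, here is a minimal config.sh sketch. The paths are placeholders, and the LBCS variable is assumed to mirror the ICS one; neither is taken from this thread:

# Case 1: use data already staged on disk, whether by the user or by others (e.g., WE2E data)
USE_USER_STAGED_EXTRN_FILES="TRUE"
EXTRN_MDL_SOURCE_BASEDIR_ICS="/path/to/staged/model_data"    # placeholder path
EXTRN_MDL_SOURCE_BASEDIR_LBCS="/path/to/staged/model_data"   # placeholder path

# Case 2: let the workflow attempt to fetch external model data from HPSS or an online archive
USE_USER_STAGED_EXTRN_FILES="FALSE"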

@jwolff-ncar (Collaborator) left a comment

Added more comments but still not finished with all files yet.

==============

The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere
(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2020`. The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here <https://ufs-weather-model.readthedocs.io/en/ufs-v2.0.0/>`__.
Collaborator

The same comment applies to all of these components: will these versions be updated for this release of the SRW App? I would think quite a few changes may be needed for the UFS WM, since it has changed a lot in a year, so the link should likely be updated to v3.0.0.

Collaborator Author

I'm still seeing only a v2.0.0 tag on the WM repo and in their docs. I've changed the docs to represent "latest" in most cases but can change them at a later date to a specific version for the SRW App release.

Collaborator

I think it would be good to make sure someone is working on the updated documentation for the WM UG. It will need to be tagged for the new release.

Collaborator

I think it would be good to check and make sure that the WM UG is being updated and there is a plan to tag something for the v3 release. I am not sure who is coordinating that.

The prognostic atmospheric model in the UFS SRW Application is the Finite-Volume Cubed-Sphere
(:term:`FV3`) dynamical core configured with a Limited Area Model (:term:`LAM`) capability :cite:`BlackEtAl2020`. The dynamical core is the computational part of a model that solves the equations of fluid motion. A User’s Guide for the UFS :term:`Weather Model` is `here <https://ufs-weather-model.readthedocs.io/en/ufs-v2.0.0/>`__.

Supported model resolutions in this release include 3-, 13-, and 25-km predefined Contiguous U.S. (:term:`CONUS`) domains, each with 64 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found `here <https://noaa-emc.github.io/FV3_Dycore_ufs-v2.0.0/html/index.html>`__ and on the `NOAA Geophysical Fluid Dynamics Laboratory website <https://www.gfdl.noaa.gov/fv3/>`_.
Collaborator

Is the intention to provide full support for the user defined domains now? If so, this text should be modified. It was preliminary for v1.0 release but I think it is probably able to be fully supported now (but leave that up to EPIC to decide for sure).


Supported model resolutions in this release include 3-, 13-, and 25-km predefined Contiguous U.S. (:term:`CONUS`) domains, each with 64 vertical levels. Preliminary tools for users to define their own domain are also available in the release with full, formal support of these tools to be provided in future releases. The Extended Schmidt Gnomonic (ESG) grid is used with the FV3-LAM, which features relatively uniform grid cells across the entirety of the domain. Additional information about the FV3 dynamical core can be found `here <https://noaa-emc.github.io/FV3_Dycore_ufs-v2.0.0/html/index.html>`__ and on the `NOAA Geophysical Fluid Dynamics Laboratory website <https://www.gfdl.noaa.gov/fv3/>`_.

Interoperable atmospheric physics, along with the Noah Multi-parameterization (Noah MP) Land Surface Model options, are supported through the Common Community Physics Package (:term:`CCPP`), described `here <https://dtcenter.org/community-code/common-community-physics-package-ccpp>`__. Atmospheric physics are a set of numerical methods describing small-scale processes such as clouds, turbulence, radiation, and their interactions. There are two physics options supported for the release. The first is an experimental physics suite being tested for use in the future operational implementation of the Rapid Refresh Forecast System (RRFS) planned for 2023-2024, and the second is an updated version of the physics suite used in the operational Global Forecast System (GFS) v15. A scientific description of the CCPP parameterizations and suites can be found in the `CCPP Scientific Documentation <https://dtcenter.ucar.edu/GMTB/v5.0.0/sci_doc/index.html>`_, and CCPP technical aspects are described in the `CCPP Technical Documentation <https://ccpp-techdoc.readthedocs.io/en/v5.0.0/>`_. The model namelist has many settings beyond the physics options that can optimize various aspects of the model for use with each
Collaborator

There are four physics suites supported for this release. See this doc to get the details needed to update this paragraph: https://docs.google.com/spreadsheets/d/1m8RYA485SyMe0BSLya0rUJZKABOQMXJf7PVzufgulIQ/edit?pli=1#gid=0

@gspetro-NOAA (Collaborator Author) Mar 28, 2022

This document has the physics suites listed, and I could potentially copy the table into the docs so people have more info, but I'm wondering why someone would choose one of these versus another. Is there any documentation that describes the purpose/benefits/uses of each physics suite? For example, I think some are better for high-resolution forecasts and others for low-resolution, if I recall correctly. It's unclear to me where in the CCPP docs I'd find that kind of info (if anywhere), since I can only see GFS v16beta and RRFS v1alpha. I remember they were talking about updating the docs at one of the meetings... Should I ask Ligia?

Collaborator

There are different physics suites being tested/tuned by different organizations. It is possible that some of these would be used for a multi-physics suite ensemble in the future, too. I don't know if there is documentation on which suite to choose when. You can ask Ligia, but I am guessing that will be hard to come by.

Post-processor
==============

The SRW Application is distributed with the Unified Post Processor (:term:`UPP`) included in the workflow as a way to convert the netCDF output on the native model grid to :term:`GRIB2` format on standard isobaric vertical coordinates. The UPP can also be used to compute a variety of useful diagnostic fields, as described in the `UPP User’s Guide <https://upp.readthedocs.io/en/upp-v9.0.0/>`__.
Collaborator

@fossell Can you provide the updated UPP User's Guide link (or plan for what it will be)?

Collaborator Author

@jwolff-ncar For now I have just updated it to "latest," but we can change it later.

@jwolff-ncar (Collaborator) Mar 29, 2022

Sounds good. However, it will be important to get documentation tags that go with this release; otherwise, if something changes in develop ("latest") and no longer matches the release, that will cause problems for users in the future.

Once built, the provided experiment generator script can be used to create a Rocoto-based
workflow file that will run each task in the system in the proper sequence (see `Rocoto documentation
<https://github.com/christopherwharrop/rocoto/wiki/Documentation>`_). If Rocoto and/or a batch system is not present on the available platform, the individual components can be run in a stand-alone, command line fashion with provided run scripts. The generated namelist for the atmospheric model can be modified in order to vary settings such as forecast starting and ending dates, forecast length hours, the :term:`CCPP` physics suite, integration time step, history file output frequency, and more. It also allows for configuration of other elements of the workflow; for example, whether to run some or all of the pre-processing, forecast model, and post-processing steps.
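
As a rough sketch of the generate-and-run sequence the quoted text describes (script and file names follow the regional_workflow conventions of this era and are assumptions here, not something confirmed in this thread):

cd regional_workflow/ush
./generate_FV3LAM_wflow.sh            # reads config.sh and creates the experiment directory with FV3LAM_wflow.xml
cd <path/to/experiment/directory>
rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db   # advance the Rocoto workflow; repeat (or cron) to step through tasks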

Collaborator

Oh, this also just occurred to me - we will need to mention the ensemble capability in the UG as well. I think @JeffBeck-NOAA was going to work on that at some point! (Again, like the vx, that can be added later in a separate PR)

Collaborator

@jwolff-ncar, yep, we'll need ensemble and stochastic physics included in the documentation. I added some fairly detailed in-line comments within the original PRs, and those will be a good place to start.

Collaborator Author

@JeffBeck-NOAA Can you link to the PR's and/or list the numbers I should look at?

@JeffBeck-NOAA (Collaborator) Mar 28, 2022

Yes! PR #237 and PR #685 for the tendency-based and SPP stochastic physics, respectively. More SPP information is available in the ufs-weather-model-related issue here. Ensemble mode was implemented in PR #245, with a bug fix here and here.

The namelists and configuration files for the SRW Application are created from templates by the
workflow, as described in :numref:`Section %s <WorkflowTemplates>`.
The input files for the weather model include both static (fixed) files and grid- and date-specific files (terrain, initial conditions, boundary conditions, etc). The static fix files
must be staged by the user unless you are running on a Level 1/pre-configured platform, in which case you can link to the existing copy of the data on that machine. See :numref:`Section %s <StaticFixFiles>` for more information. The static, grid, and date-specific files are linked in the experiment directory by the workflow scripts. An extensive description of the input files for the weather model can be found in the `UFS Weather Model User's Guide <https://ufs-weather-model.readthedocs.io/en/ufs-v2.0.0/>`__. The namelists and configuration files for the SRW Application are created from templates by the workflow, as described in :numref:`Section %s <WorkflowTemplates>`.
Collaborator

Update the WM UG version link as necessary. Maybe this will be done in the release branch, but it could probably be pushed to develop too, since we would want the latest version there as well.

Collaborator Author

Should I just have it point to "latest" instead of v2.0.0? Same for the other UGs?

Collaborator

I think ultimately these will need to be version numbers, not latest, for a tagged release. But latest can be used in develop and it can be updated in the release branch when that is created.

and staged on your machine. The paths to the staged files must then be set in ``config.sh``
as follows:
The environment variables ``FIXgsm``, ``TOPO_DIR``, and ``SFC_CLIMO_INPUT_DIR`` indicate the path to
the directories where the static files are located. If you are on a pre-configured or configurable platform (i.e., a Level 1 or 2 platform), there is no need to stage the fixed files manually because they have been prestaged, and the paths are set in ``regional_workflow/ush/setup.sh``. On Level 3 & 4 systems, the static files can be downloaded individually or as a full tar file from the `FTP data repository <https://ftp.emc.ncep.noaa.gov/EIB/UFS/SRW/v1p0/fix/>`__ or from `Amazon Web Services (AWS) cloud storage <https://ufs-data.s3.amazonaws.com/public_release/ufs-srweather-app-v1.0.0/fix/fix_files.tar.gz>`__ using the ``wget`` command. Then ``tar -xf <filename>`` will extract the compressed file:
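
As a concrete illustration of the download step described above, using the AWS tar file already linked in the quoted text (the v2p0 equivalents discussed below did not exist yet at the time of this review):

wget https://ufs-data.s3.amazonaws.com/public_release/ufs-srweather-app-v1.0.0/fix/fix_files.tar.gz
tar -xf fix_files.tar.gz    # extracts the static fix files into the current directory
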
Collaborator

This tar file will need to be recreated for v2p0. Maybe creating an issue in regional_workflow would be appropriate so it is not forgotten.

Collaborator Author

Looks like you did this already?

Collaborator

I have not made these changes yet; we need to make sure these tar files are created and staged in the following dirs:
https://ftp.emc.ncep.noaa.gov/EIB/UFS/SRW/v2p0/fix/
https://ufs-data.s3.amazonaws.com/public_release/ufs-srweather-app-v2.0.0/fix/fix_files.tar.gz

Collaborator Author

Oh, I just meant that it looks like you created an issue in regional_workflow so that someone would take care of it as part of the release.

Collaborator

Ah, yes!

<https://ufs-data.s3.amazonaws.com/public_release/ufs-srweather-app-v1.0.0/ic/gst_model_data.tar.gz>`_.
The SRW Application currently supports raw initial and lateral boundary conditions from numerous models (i.e., FV3GFS, NAM, RAP, HRRR). The data can be provided in three formats: :term:`NEMSIO`, netCDF, or :term:`GRIB2`. The SRW Application currently only supports the use of NEMSIO and netCDF input files from the GFS.

The data required to run the "out-of-the-box" SRW App case described in :numref:`Chapter %s <QuickstartC>` is already preinstalled on `Level 1 <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`__ systems. Users on other systems can find the required IC/LBC data in the `FTP data repository <https://ftp.emc.ncep.noaa.gov/EIB/UFS/SRW/v1p0/simple_test_case/gst_model_data.tar.gz>`__ or on `AWS cloud storage <https://ufs-data.s3.amazonaws.com/public_release/ufs-srweather-app-v1.0.0/ic/gst_model_data.tar.gz>`_.
Collaborator

A few more files to update for v2p0. Add to issue created for above comment.

Collaborator

The link is used below too, so it needs an update there as well.


The commands in these files can be directly copy-pasted to the command line or the file can be sourced.
You may need to modify certain variables such as the path to NCEP libraries for your individual platform,
or use ``setenv`` rather than ``export`` depending on your environment:
For those working on non-cloud-based systems, skip to :numref:`Step %s <WorkOnHPC>`. Users building the SRW App using NOAA's Cloud resources must complete a few additional steps to ensure that the SRW App builds and runs correctly.
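
A tiny sketch of the setenv-vs-export distinction mentioned above; the variable name is a placeholder, not an actual entry in the env files:

export NCEPLIBS_DIR="/path/to/nceplibs"     # sh/bash-family shells
setenv NCEPLIBS_DIR "/path/to/nceplibs"     # csh/tcsh-family shells
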
Collaborator

Is it worth mentioning who has access to NOAA's cloud resources? The community might be interested to know how accessible this is.

Collaborator Author

I think the issue is that NOAA's Cloud systems are just as accessible as any level 1 system. In other words, not very. You have to be NOAA-affiliated. Eventually, I think EPIC is supposed to provide a public-facing cloud-based IDE/dev environment, but we're not there yet... and probably won't be for a while. :-/

Generate the Workflow Experiment
================================
Generating the workflow experiment requires three steps:
The appropriate commands on other Level 1 platforms will vary, and users should consult the documentation for those platforms.
Collaborator

A link here to the appropriate section to start with might be helpful.

@gspetro-NOAA (Collaborator Author) Mar 28, 2022

@jwolff-ncar The RDHPCS docs are not publicly available. Should I link anyway? I could refer them to the RDHPCS help email, too/instead... that might actually be more useful, since I'm not sure that the RDHPCS docs list actual code commands to run to allocate a compute node. This was a conversation we had a little while ago. Alternatively, I could contact the help desk or @mark-a-potts and add the commands for each Level 1 system. Thoughts?

Collaborator

If it gets too complicated, it is fine as is.

Collaborator Author

Added a link to this page, which has links to the docs for various platforms.


Run ``cmake`` to set up the ``Makefile``, then run ``make``:
Those *not* working on HPC systems may skip to the :ref:`next step <BuildC>`.
On HPC systems (including NOAA's Cloud platforms), allocate a compute node on which to run the SRW App. On NOAA's Cloud platforms, the following commands will allocate a compute node:
Collaborator

So would I run these same steps on Hera or Cheyenne? If not, could one example be shown to help users? I don't know how many users will have access to NOAA's cloud platform, so I want to make sure it is also straightforward to run the container on an HPC. Or, in that case, do you suggest not using the container and following the other instructions instead? Maybe just make that clearer?

Collaborator Author

My understanding is that the commands will differ depending on the resource manager (Slurm or PBS) being used. Those commands work for NOAA Cloud but would be slightly different on other systems, and they also differ from one system to another. @mark-a-potts @EdwardSnyder-NOAA @jkbk2004 How would we allocate compute nodes on the various HPCs?
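
For example, an interactive allocation might look roughly like the following, depending on the scheduler; account, queue, walltime, and core counts are placeholders, not verified values for any particular machine:

# Slurm-based systems
salloc -N 1 -A <my_account> -t 01:00:00
# PBS Pro-based systems (e.g., Cheyenne)
qsub -I -A <my_account> -l select=1:ncpus=36 -l walltime=01:00:00 -q regular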

Collaborator Author

Ok, I added a section with some links and examples :)


For Orion:
Download and Stage the Data
Collaborator

Is any data available in the container? Or is the user required to stage it first? I guess I am getting a little confused about what level the container is considered to be. Does it depend on the machine you run on? If it is on the cloud, what is that considered? Sorry for my confusion here.

Collaborator Author

No problem! NOAA Cloud is Level 1. The data is too big to fit in the container, so it's not included. In the container process, users bind the container to the rest of the system so that they can access the system's data. So users can just use the data that's on the platform they're using if it's a Level 1 or 2 system. I just tried to clarify that in the docs.
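
For context, the bind step being described might look something like this with Singularity; the image name and host path are placeholders, and only the -B bind syntax is the point:

singularity shell -B /path/to/host/data:/data <srw_app_container>.sif
# inside the container, the host's staged data is then visible under /data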

@jwolff-ncar (Collaborator) left a comment

We will continue to iterate on the remaining issues I raised in separate PRs for vx and for tag updates, etc. Thanks!

@gspetro-NOAA merged commit 31dab61 into ufs-community:develop Mar 30, 2022
christinaholtNOAA referenced this pull request in NOAA-GSL/ufs-srweather-app Jun 7, 2022
* Add support on NSSL/Odin (#227)

* Add support on NSSL/Odin

* Add wlfow_odin.env and modify detect_machine.sh

* update comment

* Detect explicitly odin1/odin2

* Add python module to cheyenne build environments (#232)

* Update SRW Documentation (#212)

* updated docs

* added git submodule

* fix formatting

* added new submodule commits

* fixed ref links

* finished Intro

* finish Components & Intro edits

* edited Rocoto workflow section of Quickstart

* added minor hpc submodule commits

* Updates to Rocoto Workflow in Quick Start

* add to HPC-stack intro

* submodule updates

* added submodule docs edits

* hpc-stack updates & formatting fixes

* hpc-stack intro edits

* bibtex attempted fix

* add hpc-stack module edits

* update sphinxcontrib version

* add .readthedocs.yaml file

* update .readthedocs.yaml file

* update .readthedocs.yaml file

* update conf.py

* updates .readthedocs.yaml with submodules

* updates .readthedocs.yaml with submodules

* submodule updates

* submodule updates

* minor Intro edits

* minor Intro edits

* minor Intro edits

* submodule updates

* fixed typos in QS

* QS updates

* QS updates

* QS updates

* updates to InputOutput and QS

* fix I/O doc typos

* pull updates to hpc-stack docs

* pull updates to hpc-stack docs

* fix table wrapping

* updates to QS for cloud

* fix QS export statements

* fix QS export statements

* QS edits on bind, config

* add bullet points to notes

* running without rocoto

* add HPC-Stack submodule w/docs

* split QS into container/non-container approaches

* added filepath changes for running in container on Orion, et al.

* edits to overview and container QS

* moved CodeReposAndDirs.rst info to the Introduction & deleted file

* continued edits to SRWAppOverview

* combine overview w/non-container docs

* finish merging non-container guide & SRWOverview, rename/remove files, update FAQ

* minor edits for Intro & QS

* updates to BuildRun doc through 3.8.1

* edits to Build/Run and Components

* remove .gitignore

* fix Ch 3 title, 4 supported platform levels note

* fix typos, add term links

* other minor fixes/suggestions implemented

* updated Intro based on feedback; changed SRW to SRW App throughout

* update comment to Intro citation

* add user-defined vertical levels to future work

* Add instructions for srw_common module load

* fix typo

* update Intro & BuildRunSRW based on Mark's feedback

* minor intro updates

* 1st round of jwolff's edits

* 2nd round of jwolff updates

* update QS intro

* fix minor physics details

* update citation and physics suite name

* add compute node allocation info to QS

* add authoritative hpc-stack docs to Intro

Co-authored-by: gspetro <gillian.s.petro@gmail.com>

* Update hashes of all components to latest versions (#233)

* Add a Contributor's Guide feature to the docs (#228)

* create contributor's guide

* add guidelines for making good PR

* good pull request edits

* 1st Draft of COntributor's Guide

* minor formatting edits

* Add instructions for srw_common module load

* fix module files guidance

* fix typo

* update Intro & BuildRunSRW based on Mark's feedback

* minor intro updates

* add info on documentation requirements

* 1st round of J. Beck's edits

* add textonly issue template + minor typo-type fixes

* minor edits/formatting

* remove pylintrc requirement

* fixed .yaml & other minor

Co-authored-by: gspetro <gillian.s.petro@gmail.com>

* Updates to parameters in config_defaults .rst files (#237)

* edit config intro & platform environment sections

* edit sections on cron & directory parameters, platform & parameters for running without a workflow manager

* edit NCO, file-separator, filename params, add some METplus and model config params

* ConfigWorkflow.rst revisions, added METplus to Components, grid info

* add grid config details

* changes to readme.md

* RTD readme.md edits

* create MacOS install/build instructions

* update task run and grid parameters

* fixed file params & workflow task params

* 1st draft of ConfigParameters.inc

* minor edits

* add stochastic physics var details

* update FVCOM, thread affinity params

* halo_blend, ens, crtm, custom post, subhourly updates

* update HPC-Stack submodule/docs

* Rocoto WF tasks & params

* workflow tasks/params, debug, verbose, pre-existing dir, predefined grid

* move Stochastic physics to CCP section; write component edits

* comp'l forecast, grid gen, NOMADS, user-staged files

* METplus, model config & forecast params, separator

* 2nd draft complete

* physics updates

* remove MacInstall empty file

* undo hpc-stack submodule update (save for separate PR)

* undo hpc-stack install doc update (save for separate PR)

* revisions to SPP & LSM physics

* minor edits

* update comments in LAM Grid chapter

* update LSM_SPP_EACH_STEP

* revert LSM_SPP_EACH_STEP to original definition

* combine config info into one doc instead of two

Co-authored-by: gspetro <gillian.s.petro@gmail.com>

* Added functionality for MacOS X (#242)

* Added functionality for MacOS X

 Functionality for MacOS, updated module list in srw_common

* Update build_macos_gnu.env

* Update srw_common

* Update build_macos_gnu.env

The env/build_mac_gnu.env does not load srw_common module, but instead loads individual HPC-stack modules built locally on the Mac that contain higher-versions of some packages. This avoids conflicts with SRW builds for other platforms.

* Update srw_common

corrected the version of the gftl-shared

* Update build_macos_gnu.env

No need to load libpng module found in srw_common, as this is not being built as a part of the hpc-stack on MacOS X, rather installed system-wide.

Co-authored-by: Natalie Perlin <Natalie@Natalies-MacBook-Air.local>

* Add gaea to supported platforms (#236)

* fixes for gaea

* updates for gaea

* tweak for build env

* 2nd tweak for build env

* Fixes for slurm

* another fix for env

* added version for cmake

* Update Externals.cfg

* Update wflow_gaea.env

* tweak

* pulling Externals.cfg explicitly from develop

* temporarily removing externals

* Checked out directly from develop

* Bug fix with singularity env files (#245)

* Replace bash env files with modules  (#238)

* Pass machine name to build scripts.

* Use modules environment instead of shell scripts.

* Leave conda activation to the user.

* Remove set_machine script.

* Rename env to modulefiles

* Minor fix.

* Minor fix

* Take out *module purge* from modufiles and put it in devbuild.sh

* Activate conda directly in signularity modulefile.

* Minor fixes.

* Add Gaea modulefiles.

* Restore odin env files.

* Bug fixes in singularity modulefiles.

* Move activation of Lmod to devbuild.sh

* Don't do 'module purge' on cray systems

* Put Lmod initialization code in separate script.

* Go back to using modulefile for odin.

* Optionally pass machine name to lmod-setup.sh

* Modify odin wflow modulefile.

* Allow unknown platforms in devbuild.sh

* Update documentation.

* Move cmake init out of lmod-setup.sh on odin

* Also update markup language build documentation.

* Lmod setup script for both bash and tcsh login shells.

* Some fixes for tcsh login shell.

* Add singularity platform to lmod-setup

* update hash of regional workflow (#247)

* Update WE2E documentation (#241)

## DESCRIPTION OF CHANGES: 
This updates the documentation for how to use the WE2E testing system.

## TESTS CONDUCTED: 
Compiled the documentation using `sphinx-build` and viewed the resulting html.

## DEPENDENCIES:
PR #[745](ufs-community/regional_workflow#745) in regional_workflow.

## CONTRIBUTORS: 
@gspetro

* fixes for gaea modules (#248)

* Update regional_workflow hash, add shortcuts for common devbuild.sh options (#251)

* Modifications to `CODEOWNERS` file (#252)

## DESCRIPTION OF CHANGES: 
Make the following modifications to the github `CODEOWNERS` file:
1) Add several EPIC staff (@gspetro-NOAA, @natalie-perlin, and @EdwardSnyder-NOAA) so they are notified of all PRs and can review them.
2) Remove duplicate entries.
3) Remove users who will no longer be working with the repo (thus far only @jwolff-ncar) .

## TESTS CONDUCTED: 
None.

* Add verification tasks to documentation (#243)


* METplus, model config & forecast params, separator

* remove MacInstall empty file

* undo hpc-stack submodule update (save for separate PR)

* undo hpc-stack install doc update (save for separate PR)

* combine config info into one doc instead of two

* remove ConfigParameters.inc (contents now appear in ConfigWorkflow.rst)

* add VX tables, config info, & Rocoto output tables

* add module use/load statements, fix typos

* varied minor details

* add workflow svg diagram

* condense VX task table using ##

* update README

* add png and revert hpc-stack commits until PR#240 (mac docs) is approved

* jwolff edits

* add info on run_vx.local

Co-authored-by: gspetro <gillian.s.petro@gmail.com>

* Add NOAA cloud platforms to SRW (#221)

* updates for noaacloud

* working version

* pointing to noaa-epic for testing

* changes for noaacloud

* switched to load-any

* fix for regional_workflow pointer (#260)

* update hash of regional workflow (#261)

* Feature/cheyenne fix (#258)

* Adding a github action to build on cheyenne with intel

* fixing yml

* fixes for missing load-any on cheyenne

* added pio as well

* Update .github/workflows/build.yml

Co-authored-by: Will Mayfield <59745143+willmayfield@users.noreply.github.com>

Co-authored-by: Will Mayfield <59745143+willmayfield@users.noreply.github.com>

* tweaks for build/run on gaea (#254)

* tweaks for build/run on gaea

* fixed path

* Updates for PR

* Check-in Jenkinsfile and unified scripts (#253)

* Add Jenkinsfile that includes a build and test pipeline, which
leverages the unified build and test scripts
* Supported platforms are Cheyenne, Gaea, and Orion
* Supported compilers are GNU and Intel

* Fix for miniconda3 load on Hera (#257)

Pin down the version of miniconda3 on Hera, and do not append to the module path

* Updates to Remaining Chapters (6 & 8-12) of SRW Docs/User's Guide (#255)

* edit config intro & platform environment sections

* edit sections on cron & directory parameters, platform & parameters for running without a workflow manager

* edit NCO, file-separator, filename params, add some METplus and model config params

* ConfigWorkflow.rst revisions, added METplus to Components, grid info

* add grid config details

* changes to readme.md

* RTD readme.md edits

* create MacOS install/build instructions

* update task run and grid parameters

* fixed file params & workflow task params

* 1st draft of ConfigParameters.inc

* minor edits

* add stochastic physics var details

* update FVCOM, thread affinity params

* halo_blend, ens, crtm, custom post, subhourly updates

* update HPC-Stack submodule/docs

* remove extra macinstall document

* Rocoto WF tasks & params

* workflow tasks/params, debug, verbose, pre-existing dir, predefined grid

* move Stochastic physics to CCP section; write component edits

* comp'l forecast, grid gen, NOMADS, user-staged files

* METplus, model config & forecast params, separator

* 2nd draft complete

* physics updates

* remove MacInstall empty file

* undo hpc-stack submodule update (save for separate PR)

* undo hpc-stack install doc update (save for separate PR)

* revert hpc-stack submodule update

* revisions to SPP & LSM physics

* minor edits

* update comments in LAM Grid chapter

* update LSM_SPP_EACH_STEP

* revert LSM_SPP_EACH_STEP to original definition

* combine config info into one doc instead of two

* remove ConfigParameters.inc (contents now appear in ConfigWorkflow.rst)

* update hpc-stack docs submodule

* odds & ends

* add VX tables, config info, & Rocoto output tables

* add module use/load statements, fix typos

* varied minor details

* add workflow svg diagram

* edits to rocoto ch

* updates to Rocoto chapter

* fix minor formatting/wording issues

* updates to LAMgrid chapter

* LAM Grid edits

* LAM ch: user-defined grid section

* add UPP Product tables ch 6

* I/O edits & glossary terms

* I/O Pt2

* I/O changes

* include updated images

* update docs to reflect changes in PR #238

* Graphics Ch-1st pass

* minor updates to Graphics

* minor updates to Graphics

* edit ConfigNewPlatform sections 1-4

* ConfigNewPlatform edits

* resolve merge conflicts

* I/O ch edits

* I/O edits

* more I/O edits

* hpc-stack submodule updates

* add HPC-Stack MacOs info

* WE2E edits & tables

* fix typo

* minor grammar/typos

* merge conflict resolution

* merge conflict resolution

* fix grid name

* remove resolved comments

* add compact grids

* file path updates & info for HPC-Stack

* add SRW prereqs to Intro

* change ConfigNewPlatform to a non-container quickstart

* clean up non-container quickstart

* update build options for non-container QS

* update file paths & WE2E

* minor fixes

* update I/O & Gaea file paths

* update error in non-container QS

* add warning for users w/o Rocoto

* add UPP Satellite Product instructions

* Xlink for UPP satellite output info

Co-authored-by: gspetro <gillian.s.petro@gmail.com>

* Update compiler prerequisite in docs (#267)

* clarify compiler prereqs

Co-authored-by: gspetro <gillian.s.petro@gmail.com>

Co-authored-by: Yunheng Wang <47898913+ywangwof@users.noreply.github.com>
Co-authored-by: Will Mayfield <59745143+willmayfield@users.noreply.github.com>
Co-authored-by: Gillian Petro <96886803+gspetro-NOAA@users.noreply.github.com>
Co-authored-by: gspetro <gillian.s.petro@gmail.com>
Co-authored-by: Michael Kavulich <kavulich@ucar.edu>
Co-authored-by: Natalie Perlin <68030316+natalie-perlin@users.noreply.github.com>
Co-authored-by: Natalie Perlin <Natalie@Natalies-MacBook-Air.local>
Co-authored-by: Mark Potts <33099090+mark-a-potts@users.noreply.github.com>
Co-authored-by: danielabdi-noaa <52012304+danielabdi-noaa@users.noreply.github.com>
Co-authored-by: Chan-Hoo.Jeon-NOAA <60152248+chan-hoo@users.noreply.github.com>
Co-authored-by: gsketefian <31046882+gsketefian@users.noreply.github.com>
Co-authored-by: Jesse McFarland <jesse@mcfarland.sh>
SamuelTrahanNOAA pushed a commit to SamuelTrahanNOAA/ufs-srweather-app that referenced this pull request Sep 15, 2023