Cycling on native grid and create JEDI-based analysis calc job #2949

Draft: wants to merge 80 commits into develop
DavidNew-NOAA (Contributor) commented on Sep 23, 2024

Description

This PR is a companion to GDASApp PR #1293 and JCB-GDAS PR #28.

It does three things:

  1. The JEDI variational analysis job writes increments on the native cubed-sphere grid using the new FMS non-restart IO capability added to FV3-JEDI in #1251 (merged).
  2. UFS reads the native-grid increment for the deterministic forecast using the new native-grid increment feature added in atmos_cubed_sphere PR #342 (merged).
  3. A new job, analcalc_fv3jedi, calculates the variational analysis using a JEDI-based application created in the above GDASApp PR. The GSI analcalc job is renamed analcalc_gsi (see the sketch after this list).
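
As a minimal, hypothetical sketch (not the actual PR code) of how a workflow script might select between the renamed GSI job and the new JEDI-based job: `DO_JEDIATMVAR` is assumed here to be the existing JEDI-vs-GSI switch, and the scripts in this PR may wire this up differently.

```bash
#!/usr/bin/env bash
# Hypothetical sketch only -- not the actual PR code.
# Choose the analysis-calc job name based on whether JEDI variational DA is enabled.
# DO_JEDIATMVAR is assumed to be the existing JEDI/GSI toggle; the PR may key off
# a different variable.
if [[ "${DO_JEDIATMVAR:-NO}" == "YES" ]]; then
  analcalc_job="analcalc_fv3jedi"   # new JEDI-based analysis calc (GDASApp PR #1293)
else
  analcalc_job="analcalc_gsi"       # renamed from the original GSI analcalc job
fi
echo "Using ${analcalc_job}"
```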

A few notes:

  1. The ensemble DA and forecast will continue to write and read increments on the Gaussian grid, because the ensemble recentering job reads and writes Gaussian increments. When we create a JEDI-based recentering job, we can move to native-grid increments.
  2. In addition to writing the increments on the native grid, the variational DA job will now write them at full resolution rather than ensemble resolution. The reason is that UFS is not set up to interpolate native-grid increments to different resolutions. The original choice to write increments at ensemble resolution in GSI was made for resource reasons, but since the JEDI DA system is still in development, it's probably better not to optimize prematurely and simply write at full resolution for now.
  3. To keep track of which increments are on the Gaussian grid and which are on the native grid, I add a cubed_sphere_grid_ prefix to the atminc root used in the increment names (see the filename sketch after this list). This prefix is already added to the atmf and sfcf roots for the native-grid history files created by UFS, so this just makes things consistent. If everything eventually moves to the native grid, this prefix could be removed.
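
A hedged illustration of the resulting naming convention: the dump name, cycle, and exact templates below are assumptions for illustration and may differ from what the workflow actually produces (native-grid files are often tiled, for example).

```bash
#!/usr/bin/env bash
# Hypothetical filename sketch -- illustrates the cubed_sphere_grid_ prefix only.
RUN="gdas"; cyc="00"

# Gaussian-grid increment (ensemble DA / recentering, unchanged):
echo "${RUN}.t${cyc}z.atminc.nc"

# Native-grid variational increment with the new prefix, consistent with the
# cubed_sphere_grid_ prefix already used for native-grid UFS history files:
echo "${RUN}.t${cyc}z.cubed_sphere_grid_atminc.nc"
echo "${RUN}.t${cyc}z.cubed_sphere_grid_atmf006.nc"
```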

Type of change

  • Bug fix (fixes something broken)
  • New feature (adds functionality)
  • Maintenance (code refactor, clean-up, new CI test, etc.)

Change characteristics

  • Is this a breaking change (a change in existing functionality)? YES
  • Does this change require a documentation update? NO
  • Does this change require an update to any of the following submodules? YES
    • EMC verif-global
    • GDAS #1293
    • GFS-utils
    • GSI
    • GSI-monitor
    • GSI-utils
    • UFS-utils
    • UFS-weather-model
    • wxflow

How has this been tested?

The C96C48_ufs_hybatmDA CI case runs successfully.
The GDASApp atm jjob tests pass successfully.

Checklist

  • Any dependent changes have been merged and published
  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • I have documented my code, including function, input, and output descriptions
  • My changes generate no new warnings
  • New and existing tests pass with my changes
  • This change is covered by an existing CI test or a new one has been added
  • I have made corresponding changes to the system documentation if necessary
