Release/gfsda.v16.3.0 #456
Conversation
…in MinMon_config in lieu of prod_util definition (NOAA-EMC#137)
…ventional_Monitor for BUILD_UTIL_MON option (NOAA-EMC#137)
Parallel Test: Given this result, change this PR from draft to active. Tagging @emilyhcliu, @EdwardSafford-NOAA, @aerorahul, @dtkleist for awareness.
I am ok approving these changes for v16.3.
Do we need to discuss w/ NCO if these changes/fixes to bugzilla are acceptable after code delivery?
I can't imagine why they would not.
Are the utilities and monitoring changes also tested in the short parallel?
Also, these same changes should be pulled into develop of the respective repos when possible.
Thanks for addressing the Bugzilla issues.
MinMon output was not saved due to a path error. Let me correct this and rerun MinMon. OznMon and RadMon successfully ran to completion. Since this was a cycled parallel from 2021110806 through 2021110900, all jobs in the g-w rocoto workflow were executed except the following:
DA apps executed in the EMC retro1-v16-ecf are also executed in the rocoto workflow. This includes DA utilities (e.g.,
@RussTreadon-NOAA @aerorahul I am reviewing the changes.
@RussTreadon-NOAA @aerorahul, and @KateFriedman-NOAA
DA Monitor output: After correcting the path error for MinMon, reran the control and test MinMon jobs. Output from the two parallels is identical. Comparison of OznMon and RadMon output is more involved given that output is saved in compressed tarballs. Uncompress and untar the output. Control and test OznMon output is identical. This is NOT true for RadMon output: not all test and control RadMon output is identical. Two types of difference, explained below, were found.
whereas the test has
Element
The same change was made to bcor.f90. This explains differences between control and test angle and bcor RadMon output. The question to be answered is "Which output is correct?" The answer to this question does NOT impact cycled results. Impact is limited to RadMon
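For reference, a minimal sketch of the tarball comparison described above, assuming the control and test monitor output sits under two hypothetical directories (CTL and TST) as gzipped tarballs; the actual parallel layout and tarball names may differ.

```bash
#!/bin/bash
# Minimal sketch of the OznMon/RadMon output comparison described above.
# CTL and TST are hypothetical paths to the control and test monitor
# output tarballs; the actual parallel layout and names may differ.
CTL=/path/to/control/radmon
TST=/path/to/test/radmon

workdir=$(mktemp -d)
mkdir -p "$workdir/ctl" "$workdir/tst"

# Uncompress and untar each parallel's output into its own work area.
for tarball in "$CTL"/*.tar.gz; do
  tar -xzf "$tarball" -C "$workdir/ctl"
done
for tarball in "$TST"/*.tar.gz; do
  tar -xzf "$tarball" -C "$workdir/tst"
done

# Recursively compare the extracted files; identical output prints nothing.
diff -r "$workdir/ctl" "$workdir/tst"
```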
@RussTreadon-NOAA @aerorahul Do you have any other changes you want to make for this PR? I reviewed the changes and they looked good to me. Can I go ahead and merge this PR into the authoritative release/gfsda.v16.3.0?
@RussTreadon-NOAA Found one more fix to add for the radiance monitoring: gdas_radmon_satype.txt
Great, please correct as appropriate. Long term we should see if it is possible for RadMon to use
RadMon test: As a test, undo the Print statements added to
No additional changes to
The change to
@RussTreadon-NOAA I found that we do not have IASI MetOp-C in the enkf update script. IASI MetOp-C is assimilated in the operational GSI. The iasi metop-c is included in the v16.3.0 enkf update script. We do need to find a way to have multiple entries for the same information.
Another thing I found is that we do not have iasi metop-c added to the enkf update script
You do not have a change for gdas_radmon_sattype.txt. Your change is in gdas_radmon_scaninfo.txt. But I see your point; this PR is for bugzilla fixes. I will fix gdas_radmon_sattype.txt in another PR.
OK. My mistake in not reading your comment carefully. Your call as to whether or not to include the
@emilyhcliu, I do not understand your enkf_update comment with regard to GSI/scripts/exgdas_enkf_update.sh (line 352 in 7e174ec).
A check of enkfgdas.20211129/12/atmos/gdas.t12z.enkfstat from the retro1-v16-ecf shows
and
What am I missing? Do your
I checked /lfs/h1/ops/prod/packages/gfs.v16.2.2/scripts/exgdas_enkf_update.sh.
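As a concrete version of this check, a hedged sketch (not the actual procedure used here) that greps the quoted script and an enkfstat file for iasi entries; the exact spelling of the sensor string in those files is an assumption.

```bash
#!/bin/bash
# Hedged sketch: confirm whether iasi_metop-c appears in the enkf update
# script and in a cycle's enkfstat file. The paths are the ones quoted in
# this thread; the sensor string spelling is an assumption.
SCRIPT=/lfs/h1/ops/prod/packages/gfs.v16.2.2/scripts/exgdas_enkf_update.sh
ENKFSTAT=enkfgdas.20211129/12/atmos/gdas.t12z.enkfstat

echo "iasi entries in the enkf update script:"
grep -n "iasi" "$SCRIPT"

echo "iasi_metop-c line count in the enkfstat file (0 means absent):"
grep -c "iasi_metop-c" "$ENKFSTAT"
```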
Is it OK to fix gdas_radmon_sattype.txt in this PR? (adding avhrr_metop-a and avhrr_metop-c)
Additional Run 2021110900 gdas with forked
The line in question from
We do not have FL_HDOB data every cycle. The 2021110806 cycle did not have any FL_HDOB data, so this out-of-bounds array condition was not found. Logic needs to be added to
Tagging @ilianagenkova for awareness.
The following change to
The Integer
I'm not comfortable calling this a solution. It's more accurately an engineered way around an out-of-bounds array.
Not writing a missing value for
Given this, add back the
section. What changes, if any, do we want to make to
Thoughts, @HaixiaLiu-NOAA, @ilianagenkova, and @emilyhcliu?
@RussTreadon-NOAA, I see this only now; we already emailed about it - see #463. Mitigation_flag_AMVQ is an odd variable - it was needed in preparation for the G-17 mitigated AMVs product (to address the LoopHeatPipe issue), but now NESDIS leans towards switching to the GOES-18 AMVs product and abandoning the intermediate "mitigated AMVs" product. If/when that happens, the Mitigation_flag_AMVQ could be pulled out of GSI, so it won't add to diag file size unnecessarily. When adding Mitigation_flag_AMVQ to GSI, I did wonder if we should add it to the binary diag, to the netcdf diag only, or to both. I ended up adding it to both, and now it's only causing problems.
I think there are two options:
Thanks, @emilyhcliu, for your comments.
@emilyhcliu is correct, the flag is not used in the assimilation process. It was used for diagnostics only (i.e., to decide if we needed to alter the QC for the mitigated winds). The conclusion was that there is no need for special QC steps for the mitigated winds. And recent communication with NESDIS indicates that the mitigation approach will not be implemented.
@ilianagenkova, is your recommendation that we remove
@RussTreadon-NOAA Given the timeline of the v16.3 implementation and NESDIS plans, yes, that's a reasonable way forward. Just to be clear, I only increased nreal for uv in read_satwnd and read_prepbufr.
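A hedged way to confirm whether Mitigation_flag_AMVQ is actually written to the netCDF conventional wind diag file; the diag file name below is an assumed example, not a path from this parallel.

```bash
#!/bin/bash
# Hedged sketch: check whether Mitigation_flag_AMVQ is present in the
# netCDF conventional uv diag file. The file name is an assumed example;
# actual diag names/locations depend on the parallel setup.
DIAG=diag_conv_uv_ges.2021110900.nc4

# ncdump -h prints only the header (dimensions, variables, attributes).
if ncdump -h "$DIAG" | grep -q "Mitigation_flag_AMVQ"; then
  echo "Mitigation_flag_AMVQ present in $DIAG"
else
  echo "Mitigation_flag_AMVQ not found in $DIAG"
fi
```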
@ilianagenkova, @HaixiaLiu-NOAA, @emilyhcliu
@RussTreadon-NOAA
Thank you, @MichaelLueken-NOAA!
Using PR #294 as a guide, changes related to the addition of
The modified
While this is the result we expect and hope to see, it is disturbing that building and running with
Revert
Recompile with
The minimization from the second run differs from that of the first run. The 1st run
2nd run
Apparently compiling with
Given that repeated runs of the production build reproduce one another as well as reproduce the unaltered code, we can say that in production mode the proposed changes to
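One way to make the repeatability comparison above concrete, as a sketch under assumptions: the gsistat file names and the "cost,grad,step" minimization summary pattern are guesses about where this parallel's minimization output lands, not a description of the actual procedure used.

```bash
#!/bin/bash
# Hedged sketch of the run-to-run repeatability check: pull the
# per-iteration minimization summary from two runs and diff them.
# File names and the grep pattern are assumptions, not this parallel's
# actual log layout.
RUN1=run1/gdas.t00z.gsistat
RUN2=run2/gdas.t00z.gsistat

grep "cost,grad,step" "$RUN1" > run1_minim.txt
grep "cost,grad,step" "$RUN2" > run2_minim.txt

# Identical minimization histories produce no diff output.
diff run1_minim.txt run2_minim.txt && echo "runs reproduce each other"
```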
Rerun 2021110900 enkf, gdas, and gfs using forked
Analysis results reproduce the control. Conventional diagnostic files differ between test and control because 955dae3 removes
Tagging @ilianagenkova and @emilyhcliu for awareness.
I recommend modifying g-w
To prune
Discover that repeated executions of
Note: If one wants to restore pruned files and directories, execute
The forked
from PR #464 were included in the local copy of
An operational resolution parallel was cycled from 2021110806 through 2021110900. Analysis and forecast files from the
The changes in PR #456 are ready to be merged into the authoritative
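For reference, a hedged sketch of how the analysis/forecast comparison between the two parallels could be scripted; the directory names and file glob are assumed examples, not the actual retro1-v16-ecf or test parallel layout.

```bash
#!/bin/bash
# Hedged sketch: bitwise-compare analysis and forecast files between two
# parallel output trees. Directory names and the glob are assumed
# examples, not the actual parallel layout.
CTL=/path/to/control/gdas.20211109/00/atmos
TST=/path/to/test/gdas.20211109/00/atmos

for f in "$CTL"/*; do
  [ -f "$f" ] || continue   # skip subdirectories
  base=$(basename "$f")
  if cmp -s "$f" "$TST/$base"; then
    echo "identical: $base"
  else
    echo "DIFFERS:   $base"
  fi
done
```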
@RussTreadon-NOAA I will merge this PR into the authoritative release/gfsda.v16.3.0 first, and then PR #464. One question.
Unfortunately the answer is no. Several of the bugzillas refer to the GFS, not just the DA component of the GFS. I addressed the DA piece of those bugzillas. This is not sufficient to close them. A few bugzillas are DA specific. NCO, EIB, and the GFS v16.3.0 DA POC should discuss these to see if the response is sufficient for NCO to close them. Tomorrow (8/29) I will add a table to the end of #137 separating GFS and DA-only bugzillas.
A review of GFS v16 bugzillas in issue #137 resulted in several updates to code, scripts, and fix files in release/gfsda.v16.3.0. The authoritative release/gfsda.v16.3.0 was forked into Russ.Treadon-NOAA/GSI. Bugzilla updates have been committed to the forked release/gfsda.v16.3.0. These changes are detailed in #137.

Merger of this PR into the authoritative release/gfsda.v16.3.0 closes issue #137. Closure of issue #137 does not imply closure of the associated NCO bugzillas. EIB and the GFS v16.3.0 DA POC should follow up with NCO to see which GFS bugzillas may be closed.