
Fix longname conflicts #35
Merged: 7 commits into NCAR:master, Jan 10, 2018

Conversation

climbfuji (Collaborator)

This PR fixes all long name conflicts after the merge of the separated physics schemes. It also addresses a few bugs found in the code after the merge and renames the newly created variables initial_{t,qv,...} to save_{t,qv,...} to better reflect the intended usage of these variables.

The results are bit-for-bit identical with previous versions of the code (tested on Theia).

@climbfuji (Collaborator, Author)

See http://www.dogtoytough.net/noaa/longname_checker/conflicts.html - all conflicts fixed.

@grantfirl (Collaborator)

I'm quite confused by what has happened with the various rainfall variables throughout the code. In the original GFS_physics_driver, the following happens (see the sketch after the list):

  1. rain1 is declared locally.
  2. rain1 gets passed into a deep convection scheme, where it is first zeroed out and then used to hold the deep convective rainfall (in m) for the current physics timestep.
  3. After the call to deep convection, the value of rain1 is used to calculate the deep convective rainfall for the dynamics timestep, which is stored in Diag%rainc.
  4. rain1 gets passed into shallow convection, where it is zeroed out and then used to hold the shallow convective rainfall (in m) for the current physics timestep.
  5. After the call to shallow convection, the value of rain1 is used to calculate the shallow convective rainfall for the dynamics timestep, which is added to Diag%rainc to form the total (deep + shallow) convective rainfall on the dynamics timestep.
  6. Diag%rainc is used as input (not modified) in cnvc90.
  7. rain1 gets passed into precpd, where it is zeroed out and then used to hold the rainfall due to microphysics (gridscale rain, in m) for the current physics timestep.
  8. After microphysics, rain1 is used to calculate Diag%rain (total rainfall = convective + gridscale) in one dynamics timestep.
  9. Diag%rain is used as input in calpreciptype.
  10. Diag%rain is used to accumulate total rainfall across many timesteps by adding to Diag%totprcp.
  11. Diag%rain is stored in sfcprop%tprcp.
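
For illustration, a self-contained sketch of this reuse pattern. The scheme calls are stand-in stubs, not the real deep/shallow convection and precpd interfaces, and frain is assumed here to be the physics-to-dynamics timestep conversion factor:

```fortran
! Sketch only: each "scheme" zeroes the passed-in array and refills it,
! mimicking the three-fold reuse of rain1 described above.
program rain1_reuse_sketch
  implicit none
  integer, parameter :: im = 4
  real :: rain1(im), rainc(im), rain(im), totprcp(im)
  real, parameter :: frain = 1.0     ! assumed dtf/dtp conversion factor

  totprcp = 0.0

  call scheme_stub(rain1, 0.001)     ! deep convection: zeroes rain1, fills it (m)
  rainc = frain * rain1              ! step 3: deep convective rain, dynamics dt

  call scheme_stub(rain1, 0.0002)    ! shallow convection: zeroes and refills rain1
  rainc = rainc + frain * rain1      ! step 5: total convective rain

  call scheme_stub(rain1, 0.0005)    ! microphysics: zeroes and refills rain1
  rain = rainc + frain * rain1       ! step 8: total rain on dynamics timestep
  totprcp = totprcp + rain           ! step 10: accumulate across timesteps

  print *, 'rainc =', rainc(1), ' rain =', rain(1), ' totprcp =', totprcp(1)

contains
  subroutine scheme_stub(r, amount)
    ! Mimics the pattern above: the scheme first zeroes the passed-in
    ! array, then writes its own rainfall into it.
    real, intent(inout) :: r(:)
    real, intent(in)    :: amount
    r = 0.0
    r = r + amount
  end subroutine scheme_stub
end program rain1_reuse_sketch
```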

It seems to me that we need the following variables (or similar; a sketch follows the list):

  1. rainfall_due_to_deep_convection_on_physics_timestep
  2. rainfall_due_to_total_convection_on_dynamics_timestep (currently Diag%rainc)
  3. rainfall_due_to_shallow_convection_on_physics_timestep
  4. rainfall_due_to_microphysics_on_physics_timestep
  5. rainfall_due_to_all_physics_on_dynamics_timestep (currently Diag%rain)
  6. time_accumulated_rainfall_due_to_all_physics (currently Diag%totprcp)
    (7). Is there any reason to keep sfcprop%tprcp since it is the same as Diag%rain? It is made non-negative in calpreciptype_run, but other than that, it is identical. I wonder why it is saved separately?
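
A fragment (not a complete routine; local names and calls are hypothetical) of how the flow could look with exactly one variable per purpose:

```fortran
! One interstitial array per purpose, replacing the reuse of rain1.
real(kind=kind_phys) :: raincd(im)  ! ..._due_to_deep_convection_on_physics_timestep
real(kind=kind_phys) :: raincs(im)  ! ..._due_to_shallow_convection_on_physics_timestep
real(kind=kind_phys) :: rainmp(im)  ! ..._due_to_microphysics_on_physics_timestep

call deep_convection_stub(raincd)
call shallow_convection_stub(raincs)
Diag%rainc   = frain * (raincd + raincs)    ! total convective, dynamics timestep
call microphysics_stub(rainmp)
Diag%rain    = Diag%rainc + frain * rainmp  ! all physics, dynamics timestep
Diag%totprcp = Diag%totprcp + Diag%rain     ! time-accumulated total
```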

@climbfuji (Collaborator, Author)

climbfuji commented Jan 9, 2018

Good point, Grant. The entire thing is a mess. Right now, we still get identical results, so what we have been doing thus far in the code of this PR seems to be ok. But I totally agree that instead of reusing rain1 for different purposes all the time, we should have exactly one variable for each purpose.

EDIT: I am thinking we should finish the longname conflicts and metadata mapping first, and then take care of these. As long as we still get bfb results, we know that in principle things work the way they did beforehand?

@ligiabernardet (Collaborator)

ligiabernardet commented Jan 9, 2018 via email

@climbfuji (Collaborator, Author)

By creating interstitial variables, and quite a few of them, we do increase the memory footprint of the application. However, this is not necessarily bad for performance: in the original code, local variables are defined/allocated each time GFS_physics_driver or GFS_radiation_driver is run and deleted when these drivers are left, whereas in our code the arrays stay alive. It will be interesting to see if we need another interstitial routine to zero out fields that previously would have been created fresh on each call. Maybe not; it depends on whether the compiler initializes everything to zero by default or not.
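
A minimal sketch of what such a reset could look like, assuming a hypothetical module holding one persistent interstitial array. The Fortran standard does not guarantee zero-initialized memory, so an explicit reset is the safe choice:

```fortran
module interstitial_sketch
  implicit none
  integer, parameter :: kind_phys = selected_real_kind(13, 300)
  real(kind=kind_phys), allocatable :: save_t(:,:)  ! allocated once, stays alive
contains
  subroutine interstitial_create(im, levs)
    ! Allocate once at initialization instead of on every driver call.
    integer, intent(in) :: im, levs
    if (.not. allocated(save_t)) allocate(save_t(im, levs))
  end subroutine interstitial_create

  subroutine interstitial_reset()
    ! Do not rely on the compiler zeroing arrays by default.
    save_t = 0.0_kind_phys
  end subroutine interstitial_reset
end module interstitial_sketch
```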

@grantfirl (Collaborator)

Here are some other questions regarding this PR:

  1. I noticed that save_qcw was added to DCNV_generic_pre. Previously, there was no need to "save" cloud water in this routine. What changed to necessitate this?
  2. I noticed that there were many changes to variable standard names having to do with vertical position, namely "layer" being added to many variables within the radiation schemes. In addition, "level" and "interface" still appear to be used interchangeably. Does there need to be a comprehensive check of this kind of thing? I was under the impression that our guiding principle was to assume that all variables with vertical extent are located at grid centers (what we're calling "layer"), so we don't explicitly need "layer" in the variable name. For variables with vertical extent that are located on vertical grid interfaces, we were going to add "interface" to the name. Where a scheme uses a different vertical arrangement from the host model, this information would need to be part of the variable names too -- e.g. some variables in radiation using a different (or extended?) vertical grid. (See the examples after this list.)
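
To make the convention concrete, hypothetical names under the rule described in item 2 (illustrative only, not names taken from the actual metadata tables):

```
air_temperature                              ! grid cell centers: no vertical decorator needed
air_temperature_at_interface                 ! vertical interfaces: explicit "interface"
air_pressure_at_interface
air_temperature_at_interface_for_radiation   ! scheme-specific (e.g. extended) vertical grid
```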

Should we go ahead and accept this PR and address the rainfall and other issues afterward?

@climbfuji (Collaborator, Author)

Ligia, Man and I sat down together one afternoon and decided on the longnames for the variables with conflicts. Almost all the time, we went for a longname that was already suggested by one of the schemes, i.e. the "layer", "level" and "interface" decorators were already in there. I find it useful to be able to distinguish between cell centers and interfaces, but I doubt whether the term "layer" is such a good choice; I would prefer having "level" and "interface". We can fix all this later on. While this PR is sitting here, I have piled up a large number of commits on my machine already. As I am going ahead with the metadata tables, I am finding quite a few things to change (obvious bugs where the wrong variable is passed in, but also metadata-related issues such as the wrong rank, unit, or type).

@grantfirl (Collaborator)

OK, thanks for the info. We had a discussion several months ago regarding "layer", "level", and "interface". Interestingly, there were many folks in the room for whom "level" was synonymous with "interface" (although "level" to me means grid cell center). So it seems that both "layer" and "level" are ambiguous and may not immediately convey that the variable is at the vertical center of the grid cell. Perhaps that is one reason why we were attempting to use the convention that, unless the variable name says otherwise, the variable is assumed to be located at the grid cell centers. This also jibes with the CF conventions, which tend not to use this kind of specifier. I still think that calling out which variables are located on grid cell vertical interfaces by using the "interface" identifier is important, so there is definitely agreement there.

So, what you're saying is that we should just go ahead and approve this then and continue to change longnames from this point forward in future PRs? I'm happy to do so.

@climbfuji (Collaborator, Author)

@grantfirl I think this would be best. There is so much more stuff to clean up afterwards anyway. As long as each PR still produces bfb results and doesn't introduce absolute nonsense I would advocate to go ahead. If you suggest changes right away, I will include them in the next round of longname conflict PRs.

@grantfirl (Collaborator)

OK, sounds good.

@grantfirl (Collaborator) left a review

Approved with potential issues called out to be addressed in future PRs.

climbfuji merged commit 4182c15 into NCAR:master on Jan 10, 2018.
climbfuji deleted the fix_longname_conflicts branch on Jan 11, 2018.
hannahcbarnes pushed a commit to hannahcbarnes/ccpp-physics that referenced this pull request Apr 7, 2020
* fv3atm issue NCAR#37: fix the real(8) lat/lon in netcdf file
* fv3atm NCAR#35: Reducing background vertical diffusivities in the inversion layers
* fv3atm NCAR#24: bug in gfsphysics/physics/moninedmf_hafs.f
* fv3atm NCAR#18: Optimize netcdf write component and bugfix for post and samfdeepcnv.f
* set (0-1) bounds for ficein_cpl
* remove cache_size due to lower netcdf version 4.5.1 on mars
* Change ice falling to 0.9 in gfsphysics/physics/gfdl_cloud_microphys.F90
climbfuji pushed a commit to climbfuji/ccpp-physics that referenced this pull request Jun 5, 2020
Cleanup of radiation (clouds) - microphysics interaction (based on NCAR#33)
hannahcbarnes pushed a commit to hannahcbarnes/ccpp-physics that referenced this pull request Aug 3, 2022
* update post lib to upp lib and add dzmin change in fv3 dycore
* add dycore change NCAR#35
* merge with top of dev/emc dycore branch
* remove duplicate read_data in fms_mod in external_ic.F90
SamuelTrahanNOAA pushed a commit to SamuelTrahanNOAA/ccpp-physics that referenced this pull request Jan 31, 2023
Bug fix for cloud effective radius for convective clouds (HR1)
dustinswales pushed a commit to dustinswales/ccpp-physics that referenced this pull request Feb 9, 2023
Bug fix for cloud effective radius for convective clouds (HR1)