
Having a more flexible CMake build system #416

Closed
uturuncoglu opened this issue Feb 11, 2021 · 48 comments
Labels
enhancement New feature or request

Comments

@uturuncoglu
Collaborator

Description

I am currently working on integrating the CDEPS data components with the model under the HAFS application. This requires extensive work on the CMake build system to build different data-component configurations, as well as adding PIO to support both CMEPS and CDEPS. The good news is that an initial version supporting the DATM+DOCN and DATM+HYCOM configurations is ready in the following repository (FV3+DOCN will be available soon).

https://github.com/hafs-community/ufs-weather-model.git - feature/hafs_couplehycom_cdeps branch

While working with the CMake build I ran into a couple of issues with the build system that I would like to bring to your attention. I think these need to be addressed in the near future.

  • The current CMake build system is not flexible enough to turn individual model components on and off, which requires a lot of customization, especially to support the new data components provided by CDEPS and other possible applications/components. One example: FMS is built in all cases, even for a CDEPS DATM+HYCOM build that does not require FMS. Currently there is no way to activate FMS only when FV3 or MOM6 is activated. There is also no way to turn MOM6 and CICE on or off explicitly, such as MOM6=Y or CICE=Y; they are controlled only by the S2S build shortcut, which also makes the build more complicated. I think those components should follow the same implementation as WW3 and HYCOM. Since data components will be available soon and there will be multiple ocean components (MOM6, HYCOM), a flexible build system that allows different combinations of model components to be created by turning them on and off, with dependencies defined between them, will be crucial for the UFS model. This will simplify the top-level CMake file and also enable easy development and integration of other applications such as HAFS.

  • The S2S shortcut creates further confusion: rather than turning on CMEPS, MOM6 and CICE explicitly, you just set S2S=Y. This does not follow the current convention and will require additional customization as new dependencies and components become available. One example: I had a problem extending PIO support for CMEPS and CDEPS in the S2S-related configurations, and it took me a long time to figure out that S2S also requires an additional hack to make the S2S RT tests run without problems. Shortcuts like S2S can still be available, but the implementation logic could be revisited: setting S2S=Y could simply trigger CMEPS=Y, MOM6=Y and CICE=Y, rather than adding/activating those components inside the S2S block or defining their dependencies (such as FMS) there.
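As a rough illustration of the requested behavior, explicit per-component switches with derived dependencies could look something like the sketch below; the option names are illustrative, not the model's actual build flags:

```cmake
# Illustrative sketch only: explicit component switches plus derived deps.
option(FV3   "Build the FV3 atmosphere" ON)
option(MOM6  "Build the MOM6 ocean"     OFF)
option(CICE6 "Build the CICE6 sea ice"  OFF)
option(CMEPS "Build the CMEPS mediator" OFF)
option(S2S   "Convenience shortcut"     OFF)

# A shortcut only flips other switches; it carries no logic of its own.
if(S2S)
  set(CMEPS ON)
  set(MOM6  ON)
  set(CICE6 ON)
endif()

# FMS is derived from the components that actually require it.
if(FV3 OR MOM6)
  set(FMS ON)
endif()
```

With this shape, a DATM+HYCOM build would leave FV3 and MOM6 off and FMS would never be pulled in.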

Solution

The easiest solution is to follow the same logic used for WW3 and HYCOM and allow explicit control of all components, including FV3, FMS, etc. This will let users build only the required components, reducing the executable size and slightly improving model performance.

Alternatives

N/A

Related to

N/A

@uturuncoglu uturuncoglu added the enhancement New feature or request label Feb 11, 2021
@aerorahul
Contributor

@uturuncoglu
These are all valid points.
At the time options such as S2S=ON and DATM=ON were added, there was a single choice each for coupler/mediator (CMEPS), ocean (MOM6), and sea ice (CICE6). FMS is needed for all of these variations.

As the applications grow, more granularity in prescribing these configurations will be required, while also keeping track of viable, valid and commonly used applications. Convenience options, e.g. S2S, DATM, etc., could be replaced with more informative/descriptive options. In the end, we would need to check for valid/absurd combinations; e.g., -DMOM6=ON -DHYCOM=ON -DDATOCN=ON makes no sense in a single application.
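Such a validity check could be as simple as a few guards in the top-level CMake file; this is a sketch assuming per-component flags like those named above exist:

```cmake
# Illustrative guards against contradictory component selections.
if(MOM6 AND HYCOM)
  message(FATAL_ERROR "MOM6 and HYCOM are both ocean components; enable only one.")
endif()
if(DATOCN AND (MOM6 OR HYCOM))
  message(FATAL_ERROR "A data ocean cannot be combined with a prognostic ocean.")
endif()
```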

@uturuncoglu
Collaborator Author

@aerorahul yes. Also note that we currently support both the custom DATM and the DATM provided by CDEPS. I think they will coexist until they have been tested side by side for a reasonable period. Eventually, the CDEPS data components will replace the custom DATM.

@junwang-noaa
Collaborator

junwang-noaa commented Feb 11, 2021 via email

@jedwards4b

@junwang-noaa CESM has been successfully doing this with CIME for some time now. CESM has a large set of possible component models.

@junwang-noaa
Collaborator

junwang-noaa commented Feb 12, 2021

@jedwards4b That's good to know. May I ask how CESM validates sub-component combinations?

@jedwards4b

It is done using the concept of compsets: https://esmci.github.io/cime/versions/master/html/users_guide/compsets.html
The user may attempt to build any combination of component models, but if they try to define something outside the list of predefined compsets, they are warned and required to add an additional flag to the create_newcase command.

@aerorahul
Contributor

I think that is too complicated and we can do better. Besides, it requires use of an external system (CIME) as a dependency.
We can define valid combinations in a simple cmake file that is included in the UFS.
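A minimal sketch of what such an included cmake file might look like (app names and component flags here are illustrative):

```cmake
# Illustrative list of supported applications and their component sets.
set(VALID_APPS ATM ATMW S2S S2SW HAFS)
if(NOT APP IN_LIST VALID_APPS)
  message(FATAL_ERROR "APP=${APP} is not a supported application")
endif()

# Each supported app expands to a fixed, known-good set of components.
if(APP MATCHES "^S2S")
  set(CMEPS ON)
  set(MOM6  ON)
  set(CICE6 ON)
  set(FMS   ON)
endif()
```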

@uturuncoglu
Collaborator Author

uturuncoglu commented Feb 12, 2021

@aerorahul I think the compset approach is the way to go if you really want a flexible modeling system. The unification of the UFS Weather Model will force us to define many different configurations (or compsets), and the user community could also contribute new applications through the use of compsets. This is just an idea; you could use CIME as a reference to build your own system if you don't want to be tied to CIME. The approach used in CESM is well tested and has been used by the community for a while, so it would be nice to inherit the idea from there rather than inventing yet another way of managing configurations.

@aerorahul
Contributor

@uturuncoglu
I am not opposed to the concept of a component set. I am reluctant to add a dependency that may not be entirely necessary, or that is overkill for the job.

The user community by and large works on pre-defined applications (S2S, HAFS, etc.). A much smaller community adds new applications. An even smaller community adds new components. The operational community implements applications that undergo significant testing and evaluation of the components, and those big components within the applications seldom change. It is a balance.

A truly flexible system would build with all available components and trigger the desired ones at runtime via choices in the runtime configuration: a Unified Forecast System capable of running all applications without a rebuild.

It would be useful, however, to see the concepts implemented via the different mechanisms before reaching a decision.

My 2 cents.

@arunchawla-NOAA

This discussion has petered out a bit. Can we define a few compsets that will cover the range of applications we are working on for building? I agree with @aerorahul that we do not want to tie the build system to CIME, but if there are combinations we can put together, that would be useful.

@DusanJovic-NOAA
Collaborator

Did #477 address this issue?

@junwang-noaa
Collaborator

junwang-noaa commented Mar 24, 2021

@uturuncoglu would you please take a look at the latest repository (code changes in PR #477)? We are going to use the application compile options while giving developers the flexibility to define their own applications. If you have further comments, please let us know; if not, we will close the ticket. Thanks.

@uturuncoglu
Collaborator Author

@junwang-noaa okay, let me check and I'll get back to you soon. I have a meeting shortly, but I'll do it after that.

@uturuncoglu
Collaborator Author

@junwang-noaa I think we need to modify it for the HAFS application, right? As far as I can see, there are no HAFS-specific apps. Also, different configurations need different apps: for example, fully coupled can be HAFS, but FV3 coupled with HYCOM needs to be HAFS_DOCN, etc. There seems to be no convention for naming the apps, but maybe I missed it. It would be nice to impose such a convention so that the configuration of an app can be understood from its name. For example, S2S does not imply anything about the included components; it seems to be just an abstract definition.

@junwang-noaa
Collaborator

@uturuncoglu First, I want to confirm that the PR provides a flexible way to define new applications. For the application naming conventions, we need to discuss with the associated modeling groups. The point is to add only fully supported HAFS applications in UFS. On the code side, just specify which components are included in HAFS and HAFS_DOCN.

@uturuncoglu
Collaborator Author

@junwang-noaa thanks for the clarification. Once this is synced with the HAFS app, I'll look at it, fill in any missing parts, and let you know. In any case, you could still review the PR from HAFS to the UFS model, and we can adjust the implementation based on your review.

@junwang-noaa
Collaborator

@uturuncoglu Do you have any update on this issue?

@uturuncoglu
Collaborator Author

@junwang-noaa I have no update on it, since I am still waiting for the CDEPS PR to update the CDEPS branch under the HAFS application.

@junwang-noaa
Collaborator

@uturuncoglu I am curious: is CDEPS required in the current HAFS application? If not, you may need to add the HAFS application compile option before adding another application compile option for CDEPS. According to Bin Liu, the HAFS fork was merged with UFS on 3/31; you may take a look at the updated build system.

@uturuncoglu
Collaborator Author

uturuncoglu commented Apr 12, 2021

@junwang-noaa The CDEPS data atmosphere and ocean are already integrated with the HAFS workflow, and all data configurations are defined under the model-level RT. We could try to sync the CDEPS branch with the latest build updates, but this will break the data-component configurations unless I reimplement CDEPS in the build. If I sync, then I need to modify the build system for the CDEPS-related configurations, which is already done in the UFS CDEPS PR. It will also bring extra changes to the build system and create another set of conflicts; I think that is a waste of time. My initial intention was to wait until the UFS CDEPS PR is merged, but it seems that will take time, and I still don't know when it will be ready. Maybe we made the wrong choice in the first place by waiting for the UFS CDEPS PR. Anyway, we have a HAFS call today and we could discuss it there, but my personal preference is to wait for the CDEPS PR at the UFS level and not waste time on the build.

@junwang-noaa
Collaborator

I see. I guess since we are using two different repositories, it adds another layer of dependencies. If we used a single repository, then the HAFS applications could be handled like the other applications, and the efforts of adding CDEPS to UFS could be merged.

@uturuncoglu
Collaborator Author

Hi All,

I am writing again in this thread because I think we have an issue with the app definitions in the model. I am currently working on syncing the HAFS model-level CDEPS branch with the latest version of the UFS Weather Model, and I think the cmake/configure_apps.cmake file is not generic at this point and needs to be changed.

If you look at the file carefully, you can see that when you activate DATM, it assumes the CMEPS, FMS, MOM6 and CICE6 components are activated by default. I need to define a slightly different DATM configuration under HAFS, one that uses a different ocean model (HYCOM) without CICE6, so I don't need FMS, MOM6 or CICE6 for DATM. I think the current DATM configuration is specific to S2S and should move to the S2S section under a different name such as S2SD; then I could do the same thing for HAFS, e.g. HAFSD. There could also be different configurations of the data atmosphere, for example one that uses a wave component and one that does not.

Anyway, I think we need a consensus/convention here for the definition of the APPs. I'll try to fix it at the HAFS level, and this could eventually go to the UFS Weather Model along with the data-ocean support, because I don't want to wait for another round of syncing. Please let me know what you think; maybe I am missing something that you can see.
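A sketch of how configure_apps.cmake could keep such variants apart (the app names S2SD and HAFSD are the proposals above, not existing apps):

```cmake
# Illustrative: data-atmosphere variants as distinct applications.
if(APP STREQUAL "S2SD")          # CDEPS DATM driving MOM6 + CICE6
  set(CDEPS ON)
  set(CMEPS ON)
  set(FMS   ON)
  set(MOM6  ON)
  set(CICE6 ON)
elseif(APP STREQUAL "HAFSD")     # CDEPS DATM driving HYCOM, no sea ice
  set(CDEPS ON)
  set(CMEPS ON)
  set(HYCOM ON)
endif()
```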

@uturuncoglu
Collaborator Author

uturuncoglu commented May 20, 2021

Also, I think the data components need to be treated like other components, so that DATM or DOCN can be activated individually. I designed it that way in the HAFS CDEPS branch and it worked very nicely, so maybe I need to make it available again under the new build system.

BTW, the top-level CMake file seems to use the CDEPS flag to activate DATM. If we want to activate DOCN, that will be a problem, because there is no distinction between DATM and DOCN; both are gated by CDEPS, like the following:

if(CDEPS)
  add_dependencies(ufs cdeps::datm)
  list(APPEND _ufs_defs_private CDEPS-interface/CDEPS
                                FRONT_CDEPS_DATM=atm_comp_nuopc)
  include_directories(${CMAKE_CURRENT_BINARY_DIR}/CDEPS-interface/CDEPS/datm)
  target_link_libraries(ufs PUBLIC cdeps::datm)
endif()

One easy solution could be adding definitions for each data component to the if statement, but this would add extra data components to the executable that are not used by all configurations.
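For illustration, per-component guards along these lines might look like the following; the DATM/DOCN flag names are assumptions, not existing build options:

```cmake
# Illustrative: gate each data component on its own flag instead of the
# single CDEPS flag, so only the components a configuration needs are linked.
if(DATM)
  add_dependencies(ufs cdeps::datm)
  target_link_libraries(ufs PUBLIC cdeps::datm)
endif()
if(DOCN)
  add_dependencies(ufs cdeps::docn)
  target_link_libraries(ufs PUBLIC cdeps::docn)
endif()
```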

@junwang-noaa
Collaborator

@uturuncoglu Thanks for pointing out the confusion around the DATM application. As you see, the "DATM" app means the application used for GODAS, which contains the CDEPS:DATM/CMEPS/MOM6/CICE6/FMS subcomponents; it does not mean the CDEPS DATM component. The application structure in cmake/configure_apps.cmake is generic enough to handle this application. To avoid confusion, we can consider using GODAS to replace DATM and GODAS_NEMSDATM to replace the current NEMS_DATM as app names. Also, the S2S/S2SW apps contain FV3/MOM6/CICE6/CMEPS/FMS, which includes the FV3 atmosphere component, so I think it's better not to use "S2SD", to avoid confusion. On the HAFS side, you can use "HAFSD" as the app name if everybody knows what it is.

So far we are using CDEPS as a sub-component within UFS, not as "CDEPS-DATM". My understanding is that CDEPS:DATM and CDEPS:DOCN are independent within CDEPS; if you need the DOCN component, you can add docn in the CDEPS section listed above. It won't break any applications UFS currently supports; please let me know if this is not the case.

@uturuncoglu
Collaborator Author

@junwang-noaa thanks for the clarification. It is still not clear to me how to distinguish between DATM and DOCN, since the data-model build requirements are controlled only by the CDEPS flag. In my previous implementation I used DATM and DOCN flags in addition to CDEPS to make that distinction. Anyway, I'll try to find a workaround and make the required build changes if I need to. Eventually we will have a PR at the UFS level and you can review the changes.

@junwang-noaa
Collaborator

@uturuncoglu My understanding is that if any UFS application (e.g. HAFSD) needs CDEPS-DOCN, we will compile it in UFS when CDEPS is ON. Every component has many features that are used only by certain applications, not by all of them, yet we still compile the component with those features. I think we can compile CDEPS with both datm and docn for the GODAS (DATM) application; please let us know if you see any conflict.

@aerorahul
Contributor

aerorahul commented May 24, 2021

@uturuncoglu

When we constructed the application-specific target names (-DAPP=), there was a single data component that linked with CDEPS, and it was easiest to just link the UFS library with that data component (cdeps::datm).
I would recommend we link to the cdeps::cdeps library whenever a data component is required. cdeps::cdeps would be a library containing all of the CDEPS data components: datm, docn, dlnd, etc. Doing so will allow us to build the model library and executable once and choose the configuration at runtime via the run sequence.
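One way such an aggregate target might be assembled, assuming the per-component targets already exist (a sketch, not the actual CDEPS build):

```cmake
# Illustrative: a single interface target carrying all CDEPS data components.
add_library(cdeps_all INTERFACE)
target_link_libraries(cdeps_all INTERFACE cdeps::datm cdeps::docn cdeps::dlnd)
add_library(cdeps::cdeps ALIAS cdeps_all)

# A consumer then links once and selects the active component at runtime
# through the run sequence:
#   target_link_libraries(ufs PUBLIC cdeps::cdeps)
```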

@uturuncoglu
Collaborator Author

@aerorahul okay, if this is the way you want to go, it is fine with me. Do we have libcdeps.a? I mean, did you combine all the libraries (datm, docn, etc.) into a single one, so that I could then use cdeps::cdeps?

@aerorahul
Contributor

I have not built a unified cdeps cmake target yet.
I wanted to check with you and the other CDEPS developers whether a runtime choice of the components is possible. If it is, I can make a quick patch and send it your way for testing; you can bring it in with the HAFS app, or we can issue a short, clean PR in the UFS.

@uturuncoglu
Collaborator Author

@aerorahul a patch is fine, so we could test it at the HAFS level. Those changes will eventually go to the UFS level. Thanks for your help.

@aerorahul
Contributor

aerorahul commented May 24, 2021

@uturuncoglu
Please take a look and apply the following patch on develop at 6780244.
cdeps_diff.txt

I built it with the DATM application as follows:

cd tests/
./compile.sh orion.intel "-DAPP=DATM" 1 yes no

I did not run the test.

@uturuncoglu
Collaborator Author

@aerorahul Thanks for the quick response. This patch seems to include all data components when CDEPS is activated, which solves the problem. I think Orion will not be available tomorrow, but I'll test after that and let you know if I have any issues.

@uturuncoglu
Collaborator Author

@aerorahul The patch works fine, but I think there is an issue in compile.sh. If I create an APP named "HAFS_DATM" and add the following code to compile.sh, it matches both HAFS and HAFS_DATM and adds both of them to CMAKE_FLAGS. The string matching at this level gets confused and needs to be revisited. Let me know what you think.

if [[ "${MAKE_OPT}" == *"APP=HAFS"* ]]; then
    CMAKE_FLAGS="${CMAKE_FLAGS} -DAPP=HAFS"
fi

if [[ "${MAKE_OPT}" == *"APP=HAFS_DATM"* ]]; then
    CMAKE_FLAGS="${CMAKE_FLAGS} -DAPP=HAFS_DATM"
fi
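The root cause is that *"APP=HAFS"* is a substring glob, so it also matches APP=HAFS_DATM. A minimal reproduction (with an assumed MAKE_OPT value, and grep standing in for the extraction logic):

```shell
#!/usr/bin/env bash
# The glob *"APP=HAFS"* is a substring match, not a whole-token match.
MAKE_OPT="CCPP=Y APP=HAFS_DATM 32BIT=Y"

if [[ "${MAKE_OPT}" == *"APP=HAFS"* ]]; then
  echo "glob matched APP=HAFS"   # fires, even though the app is HAFS_DATM
fi

# Comparing the isolated APP token instead gives an exact match:
APP_OPT=$(echo "${MAKE_OPT}" | tr ' ' '\n' | grep '^APP=')
if [[ "${APP_OPT}" == "APP=HAFS" ]]; then
  echo "exact match APP=HAFS"    # does not fire
fi
echo "APP token is ${APP_OPT}"
```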

@uturuncoglu
Collaborator Author

I also tested with existing applications such as APP=HAFS_DATM and APP=HAFS. For example, if you compile the model with ./compile.sh "$target" "APP=DATM_NEMS" 32bit YES NO, it adds both -DAPP=DATM and -DAPP=DATM_NEMS to CMAKE_FLAGS. So there is definitely a problem in this logic, and compile.sh needs to be fixed. You can easily reproduce it by compiling the DATM_NEMS app.

@uturuncoglu
Collaborator Author

@aerorahul Maybe the issue is having _ in the app name.

@uturuncoglu
Collaborator Author

Here is the proposed solution for compile.sh. It isolates the APP part and works for the following cases:

  1. ./compile.sh "$target" "CCPP=Y APP=HAFS_DOCN STATIC=Y SUITES=HAFS_v0_gfdlmp_tedmf_nonsst,HAFS_v0_gfdlmp_tedmf,HAFS_v0_gfdlmp_nocpnsst,HAFS_v0_gfdlmp_nonsst,HAFS_v0_gfdlmp_nocp,HAFS_v0_gfdlmp,HAFS_v0_hwrf_thompson,HAFS_v0_hwrf 32BIT=Y" 32bit YES NO

  2. ./compile.sh "$target" "APP=HAFS_DOCN CCPP=Y STATIC=Y SUITES=HAFS_v0_gfdlmp_tedmf_nonsst,HAFS_v0_gfdlmp_tedmf,HAFS_v0_gfdlmp_nocpnsst,HAFS_v0_gfdlmp_nonsst,HAFS_v0_gfdlmp_nocp,HAFS_v0_gfdlmp,HAFS_v0_hwrf_thompson,HAFS_v0_hwrf 32BIT=Y" 32bit YES NO

So the order of the APP definition does not affect the result, and it always matches the correct app. I'll push these changes to my HAFS CDEPS branch, and they could eventually go to the upstream repository.

 # Valid applications
-if [[ "${MAKE_OPT}" == *"APP=ATM"* ]]; then
+APP_OPT=$( echo $MAKE_OPT | tr " " "\n" | sed -n '/APP/p;/APP/q' )
+echo "APP_OPT = ${APP_OPT}"
+
+if [[ "${APP_OPT}" == "APP=ATM" ]]; then
     echo "MAKE_OPT = ${MAKE_OPT}"
     CMAKE_FLAGS="${CMAKE_FLAGS} -DAPP=ATM"
 fi
 
-if [[ "${MAKE_OPT}" == *"APP=ATMW"* ]]; then
+if [[ "${APP_OPT}" == "APP=ATMW" ]]; then
     CMAKE_FLAGS="${CMAKE_FLAGS} -DAPP=ATMW"
 fi
 
-if [[ "${MAKE_OPT}" == *"APP=S2S"* ]]; then
+if [[ "${APP_OPT}" == "APP=S2S" ]]; then
     CMAKE_FLAGS="${CMAKE_FLAGS} -DAPP=S2S -DMOM6SOLO=ON"
 fi
 
-if [[ "${MAKE_OPT}" == *"APP=S2SW"* ]]; then
+if [[ "${APP_OPT}" == "APP=S2SW" ]]; then
     CMAKE_FLAGS="${CMAKE_FLAGS} -DAPP=S2SW -DMOM6SOLO=ON"
 fi
 
-if [[ "${MAKE_OPT}" == *"APP=HAFS"* ]]; then
+if [[ "${APP_OPT}" == "APP=HAFS" ]]; then
     CMAKE_FLAGS="${CMAKE_FLAGS} -DAPP=HAFS"
 fi
 
-if [[ "${MAKE_OPT}" == *"APP=HAFSW"* ]]; then
+if [[ "${APP_OPT}" == "APP=HAFSW" ]]; then
     CMAKE_FLAGS="${CMAKE_FLAGS} -DAPP=HAFSW"
 fi
 
-if [[ "${MAKE_OPT}" == *"APP=DATM"* ]]; then
+if [[ "${APP_OPT}" == "APP=HAFS_DATM" ]]; then
+    CMAKE_FLAGS="${CMAKE_FLAGS} -DAPP=HAFS_DATM"
+fi
+
+if [[ "${APP_OPT}" == "APP=HAFS_DOCN" ]]; then
+    CMAKE_FLAGS="${CMAKE_FLAGS} -DAPP=HAFS_DOCN"
+fi
+
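The tr/sed extraction above can be exercised in isolation; for example (MAKE_OPT abbreviated from case 2 above):

```shell
#!/usr/bin/env bash
# Exercise the proposed extraction: split MAKE_OPT on spaces, print the
# first token containing APP, then quit.
MAKE_OPT="APP=HAFS_DOCN CCPP=Y STATIC=Y 32BIT=Y"
APP_OPT=$(echo "${MAKE_OPT}" | tr ' ' '\n' | sed -n '/APP/p;/APP/q')
echo "APP_OPT = ${APP_OPT}"   # APP_OPT = APP=HAFS_DOCN
```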

@aerorahul
Contributor

@uturuncoglu
Separating an APP_OPT containing APP from MAKE_OPT is a good solution within compile.sh.
I will open a PR with my CDEPS library build changes to ufs-weather-model. The changes should be transparent to users/developers.

@DusanJovic-NOAA
Collaborator

This is too complicated. We should just update rt.conf to pass the proper cmake flags (-DAPP=..... -D32BIT=ON) and avoid all these translations from old GNU-make-style flags to cmake flags. None of this would be needed anymore.

@uturuncoglu
Collaborator Author

@aerorahul Thanks. I think this is the right way to go at this point, at least for the short term. @DusanJovic-NOAA Of course the build system can be improved further, but at least this fix allows me to continue the HAFS-level CDEPS development.

@aerorahul
Contributor

@DusanJovic-NOAA
rt.conf already has APP=APP_NAME. compile.sh takes MAKE_OPT, which is a collection of options. To make the change the way you suggest, a change in the call signature of compile.sh will be required.

@DusanJovic-NOAA
Collaborator

DusanJovic-NOAA commented May 27, 2021

MAKE_OPT is whatever is in the second column of rt.conf. If all options are 'cmake options' (i.e. -DVAR=VALUE), compile.sh will just pass them to cmake (or build.sh in this case).

@aerorahul
Contributor

Ok, I see what you are saying.
In that case, all MAKE_OPT values should be CMake options.
Instead of APP=ATM in rt.conf, it would be -DAPP=ATM.
That brings the CMake language into rt.conf; is that acceptable?

@DusanJovic-NOAA
Collaborator

Why not? compile.sh and build.sh already have the 'cmake language' logic. compile.sh will need to append a few more cmake options to what's passed from rt.conf's second column (like -DSIMDMULTIARCH=ON), but that's the purpose of compile.sh: to prepare and customize the build depending on the platform.
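A minimal sketch of what compile.sh would then do under this proposal (the flag values are illustrative):

```shell
#!/usr/bin/env bash
# Sketch: rt.conf's second column already holds cmake-style options,
# so compile.sh only appends platform-specific extras instead of
# translating APP=... style flags.
MAKE_OPT="-DAPP=ATM -D32BIT=ON"                  # from rt.conf, verbatim
CMAKE_FLAGS="${MAKE_OPT} -DSIMDMULTIARCH=ON"     # platform-specific addition
echo "${CMAKE_FLAGS}"
```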

@aerorahul
Contributor

Fine by me. @uturuncoglu, can we use the proposal from @DusanJovic-NOAA? It is a bit more work on your end; I can assist!

@climbfuji
Collaborator

Why not? compile.sh and build.sh already have the 'cmake language' logic. compile.sh will need to append a few more cmake options to what's passed from rt.conf's second column (like -DSIMDMULTIARCH=ON), but that's the purpose of compile.sh: to prepare and customize the build depending on the platform.

I think that is a good approach; please update all rt*conf files in the repository when making those changes. Thanks.

@uturuncoglu
Collaborator Author

@aerorahul that is also fine with me. Just let me know what I need to do; I can revert my modifications. Thanks for your help.

@arunchawla-NOAA

@uturuncoglu @aerorahul

Can this issue be closed? The PR linked to this issue is closed.

@uturuncoglu
Collaborator Author

uturuncoglu commented Mar 4, 2022

@arunchawla-NOAA I think the build system is more flexible now and allows defining different applications easily, so I think we can close this issue.

epic-cicd-jenkins pushed a commit that referenced this issue Apr 17, 2023
## DESCRIPTION OF CHANGES:
### Main Changes:
* Add the new function check_ruc_lsm.sh that reads in the SDF to check whether it uses the RUC LSM.  If so, it sets the local variable sdf_uses_ruc_lsm to "TRUE", otherwise to "FALSE".  It then sets the environment variable whose name is specified by the input argument output_varname_sdf_uses_ruc_lsm to the value of sdf_uses_ruc_lsm.
* In setup.sh, introduce new workflow variable SDF_USES_RUC_LSM by passing its name as an argument to the new function check_ruc_lsm.sh.  This gets set to "TRUE" if the physics suite uses the RUC LSM and "FALSE" otherwise.
* In exregional_make_ics.sh, set the chgres_cube namelist variable nsoill_out (the number of soil levels to have in the NetCDF file that chgres_cube generates) using an if-statement that checks the new workflow variable SDF_USES_RUC_LSM as well as the name of the external model used for ICs.  This if-statement replaces the settings of nsoill_out in the external-model-dependent case-statement and, depending on the model, the physics-suite-dependent if-statements within that case statement.  This simplifies the script because we no longer need to remember which suites use the RUC LSM and which don't.
* In generate_FV3LAM_wflow.sh, set the number of ice levels in the FV3 input file according to whether the RUC_LSM is used in the physics suite (i.e. by checking the new variable SDF_USES_RUC_LSM).

### Improvements:
* Change the name of the workflow variable THOMPSON_MP_USED to SDF_USES_THOMPSON_MP to be consistent with the new variable SDF_USES_RUC_LSM.
* In set_thompson_mp_fix_files.sh, change the local variable thompson_mp_used to sdf_uses_thompson_mp (to be consistent with the change in the corresponding workflow variable SDF_USES_THOMPSON_MP).

## TESTS CONDUCTED:
Ran the following WE2E tests on Hera using the HEAD (hash #ea8a7aa) of the authoritative ufs-community/ufs-weather-model repo:

* grid_RRFS_CONUS_13km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15p2
* grid_RRFS_CONUS_13km_ics_HRRR_lbcs_RAP_suite_GSD_SAR_2020020800
* grid_RRFS_CONUS_13km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta
* grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GFS_v15p2
* grid_RRFS_CONUS_25km_ics_FV3GFS_lbcs_FV3GFS_suite_GSD_SAR
* grid_RRFS_CONUS_25km_ics_HRRR_lbcs_RAP_suite_GSD_SAR_2020020800
* grid_RRFS_CONUS_25km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta
* grid_RRFS_CONUS_25km_modify_DT_ATMOS_LAYOUT_XY_BLOCKSIZE
* grid_RRFS_CONUS_3km_ics_HRRR_lbcs_RAP_suite_GSD_SAR_2020020800
* grid_RRFS_CONUS_3km_ics_HRRR_lbcs_RAP_suite_RRFS_v1beta
* nco_CONUS_25km_GFDLgrid
* nco_RRFS_CONUS_25km_HRRR_RAP
* suite_FV3_GSD_v0

All tests were successful.  Note that the three tests whose names end with "_2020020800" are not in the workflow, but they are generated by taking the respective WE2E tests in the workflow with names without the trailing "_2020020800" and simply changing DATE_FIRST_CYCL and DATE_LAST_CYCL from 20200801 to 20200208 and LBC_SPEC_INTVL_HRS from 3 to 1.  The change in date was to try out a winter case, which was the situation under which failures were common (the setting of kice to 9 for suites that use the RUC LSM seems to have fixed this winter case problem).