
Modifications to run workflow with unmodified executable names from each repository #304

Merged
merged 4 commits into from
Oct 9, 2020

Conversation

mkavulich
Collaborator

@mkavulich mkavulich commented Oct 7, 2020

Note: This PR must be merged at the same time as SRW App PR#27

DESCRIPTION OF CHANGES:

The new top-level cmake build for the SRW App (SRW App PR#27) results in some executables having different names. This PR makes modifications that:

  1. Allow the workflow to run successfully with the new cmake build and its different executable names, and
  2. Retain backward compatibility with the old build system, allowing a gradual transition to the new build system.

This PR also explicitly disallows running the workflow without CCPP, a capability we decided against supporting several months ago. I don't think that capability even works, so this shouldn't affect anyone at this time.
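For context, the fallback lookup described above can be sketched as a small shell function. This is a sketch only, not the actual workflow code: the function name `find_exec` and its error handling are illustrative, while `SR_WX_APP_TOP_DIR` and `UFS_WTHR_MDL_DIR` are variables that appear in the diff quoted later in this thread.

```shell
# Sketch of the two-location executable lookup (illustrative, not the PR's
# exact code). New cmake builds install into ${SR_WX_APP_TOP_DIR}/bin; the
# old build system left executables under the model's build/ directory.
find_exec() {
  local exec_fn="$1"
  local exec_fp="${SR_WX_APP_TOP_DIR}/bin/${exec_fn}"
  if [ ! -f "${exec_fp}" ]; then
    # Fall back to the old (pre-cmake) build location.
    exec_fp="${UFS_WTHR_MDL_DIR}/build/${exec_fn}"
  fi
  if [ ! -f "${exec_fp}" ]; then
    echo "ERROR: executable ${exec_fn} not found in either build location" >&2
    return 1
  fi
  echo "${exec_fp}"
}
```

Checking the new location first means the fallback disappears naturally once the old build system is deprecated.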

TESTS CONDUCTED:

  • Cheyenne: Build and end-to-end test ("DOT_OR_USCORE" test case) was successful on Cheyenne with intel, both for the cmake build and the old build script (that will soon be deprecated). Path to tests: /glade/scratch/kavulich/UFS_CAM/testing/SRW_PR_27/expt_dirs/
  • Hera: Build and end-to-end tests successful (aside from expected failures). Path to tests: /scratch2/BMC/det/kavulich/workdir/SRW_PR_27/expt_dirs
  • Jet: Build test was successful. Machine is having disk errors that are preventing me from running the end-to-end tests.

ISSUE:

It was not the primary aim of this PR, but it does partially resolve #196.

@mkavulich
Collaborator Author

A question about backwards compatibility: in its current form, this PR means that the old method of building the workflow will no longer work; only the cmake top-level build creates the correct executable names. It would be a quick fix to the "install_all.sh" script to restore backward compatibility. Is this something we want implemented?
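One hypothetical way a script like "install_all.sh" could provide that backward compatibility is to symlink the old executable names to the new cmake-installed ones. This is a sketch under assumptions: the helper name `link_old_name` and the example names used in the usage note are illustrative, not taken from the actual script.

```shell
# Hypothetical compatibility shim: if the new-style executable exists and the
# old name is absent, create a relative symlink so workflows that still look
# up the old name keep working during the transition.
link_old_name() {
  local new_name="$1" old_name="$2" bindir="$3"
  if [ -f "${bindir}/${new_name}" ] && [ ! -e "${bindir}/${old_name}" ]; then
    ln -s "${new_name}" "${bindir}/${old_name}"
  fi
}
```

For example, `link_old_name ufs_model NEMS.exe "${SR_WX_APP_TOP_DIR}/bin"` would let scripts expecting a hypothetical old name `NEMS.exe` find the new `ufs_model` binary; the guard on the old name makes the call idempotent.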

@jwolff-ncar
Contributor

jwolff-ncar commented Oct 7, 2020 via email

@mkavulich
Copy link
Collaborator Author

The library problem referenced in my PR message has been resolved (by a change in the other PR in ufs-srweather-app). Build and end-to-end test ("DOT_OR_USCORE" test case) was successful on Cheyenne with intel, both for the cmake build and the old build script (that will soon be deprecated). The Hera build and end-to-end suite of tests are also successful now. Jet is awaiting some final testing that is being delayed by some sort of disk issue; if I can't get those tests run by tomorrow afternoon, I propose merging anyway and fixing any problems afterwards. This code slush has dragged on too long!


@JulieSchramm JulieSchramm left a comment


Looks good.

exec_fp="${SR_WX_APP_TOP_DIR}/bin/${exec_fn}"
# Check for the old build location for the fv3 executable
if [ ! -f "${exec_fp}" ]; then
  if [ ! -f "${SR_WX_APP_TOP_DIR}/src/ufs_weather_model/build/${exec_fn}" ]; then
Collaborator

@gsketefian gsketefian Oct 8, 2020


It would be good to change the path ${SR_WX_APP_TOP_DIR}/src/ufs_weather_model in this if-statement to ${UFS_WTHR_MDL_DIR} so there's less hard-coding. Same thing on line 609 below.

Collaborator


Even better, to reduce repetition/hard-coding, I think this will work:

exec_fp="${SR_WX_APP_TOP_DIR}/bin/${exec_fn}"
# Check for the old build location for the fv3 executable
if [ ! -f "${exec_fp}" ]; then
  exec_fp_alt="${UFS_WTHR_MDL_DIR}/build/${exec_fn}"
  if [ ! -f "${exec_fp_alt}" ]; then
    print_err_msg_exit "\
The executable (exec_fp) for running the forecast model does not exist:
  exec_fp = \"${exec_fp}\"
Please ensure that you've built this executable."
  else
    exec_fp="${exec_fp_alt}"
  fi
fi

Collaborator Author


Good suggestion, made that edit. Running one final end-to-end test and then will merge both PRs.

@mkavulich mkavulich merged commit d79ebc3 into ufs-community:develop Oct 9, 2020
mkavulich added a commit that referenced this pull request Oct 9, 2020
## DESCRIPTION OF CHANGES: 
Applying the changes necessary for the ufs-srweather-app cmake build to the release branch. PR #304 took care of this for the develop branch (d79ebc3). This version omits the commits that were included in that PR for back-compatibility with the old build system: the release branch will only support cmake builds going forward.

This PR also explicitly disallows running the workflow without CCPP, a capability we decided against supporting several months ago. I don't think that capability even works, so this shouldn't affect anyone at this time.

## TESTS CONDUCTED: 
 - **Cheyenne**: Build and end-to-end test ("DOT_OR_USCORE" test case) was successful on Cheyenne with intel
 - **Hera**: Build and end-to-end test ("DOT_OR_USCORE" test case) was successful
gsketefian added a commit that referenced this pull request Oct 15, 2020
Modifications to run workflow with unmodified executable names from each repository (#315)

Modifications to run workflow with unmodified executable names from each repository (#304)

## DESCRIPTION OF CHANGES:
The new top-level cmake build for the SRW App ([SRW App PR#27](ufs-community/ufs-srweather-app#27)) results in some executables having different names. This PR makes modifications that:
 1. Allow the workflow to run successfully with the new cmake build and its different executable names, and
 2. Retain backward compatibility with the old build system, allowing a gradual transition to the new build system.

This PR also explicitly disallows running the workflow without CCPP, a capability we decided against supporting several months ago. I don't think that capability even works, so this shouldn't affect anyone at this time.

## TESTS CONDUCTED:
 - **Cheyenne**: Build and end-to-end test ("DOT_OR_USCORE" test case) was successful on Cheyenne with intel, both for the cmake build and the old build script (that will soon be deprecated).
 - **Hera**: Build and end-to-end tests successful (aside from expected failures). Also built with old build script successfully.
 - **Jet**: Build test was successful.

## ISSUE:
It was not the primary aim of this PR, but it does partially resolve #196.
christinaholtNOAA pushed a commit to christinaholtNOAA/regional_workflow that referenced this pull request Apr 27, 2022
RTMA changes, including changes required for WCOSS:
 - turn on cloud analysis for RTMA_CONUS
 - change the mesonet assimilation window to +/- 15 minutes