More marine DA j-jobs #1270

Merged
45 changes: 45 additions & 0 deletions jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT_VRFY
@@ -0,0 +1,45 @@
#!/bin/bash
export STRICT="NO"
source "${HOMEgfs}/ush/preamble.sh"


export DATA="${DATAROOT}/${RUN}ocnanal_${cyc}"
Contributor:

Is there a reason why every DATA is ${RUN}ocnanal_${cyc}?
We only need this if the directory must be preserved after the job completes.
If it is not needed, we should stick with the default ${job}.${jobid} defined upstream.

This comment applies to JGDAS_GLOBAL_OCEAN_ANALYSIS_VRFY as well.

Contributor Author:

Yes, at least for now it needs to be preserved.

Contributor:

This directory is being gutted here:

[[ "${KEEPDATA}" = "NO" ]] && rm -rf "${DATA}"

I suggest the post job copy what it needs to COM, and that this job create its own working directory in $DATAROOT.

Contributor Author:

ok, will look into it @aerorahul

Contributor Author:

@aerorahul, I created an issue here to clean up how we keep what we need from the cycle. Happy to add the necessary changes to that PR, or to do it in a different PR.

Contributor:

To elaborate a bit further: $DATA should only contain temporary data during runtime and is wiped at the end of the job. Typically that means a random directory name is okay. With the prep/run/post paradigm, that doesn't work because RUN needs to know where PREP staged stuff, but prep/run/post should still be considered a single job for purposes of $DATA and deleted at the end of POST. Anything that is needed outside of that "one" job needs to get placed into $COM.
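To make that concrete, here is a minimal sketch of the intended flow for the prep/run/post trio, pieced together from elsewhere in this PR (illustration only, not a verbatim excerpt of the workflow):

# One shared working directory for the whole prep/run/post sequence
export DATA="${DATAROOT}/${RUN}ocnanal_${cyc}"

# PREP stages inputs into ${DATA}; RUN executes in the same directory and does not wipe it.

# POST copies anything needed beyond this "one" job into $COM ...
cp "${DATA}/inc.nc" "${COMOUT}/${CDUMP}.t${cyc}z.ocninc.nc"

# ... and only then removes the temporary directory
cd "${DATAROOT}"
[[ "${KEEPDATA}" = "NO" ]] && rm -rf "${DATA}"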

Contributor Author:

Understood @WalterKolczynski-NOAA, I'll push the modifications that address that issue (#1273) later today ... maybe.
I'll use the default $DATA (default is ${job}.${jobid}?) for JGDAS_GLOBAL_OCEAN_ANALYSIS_VRFY; that job will use data saved in $COM. The rest of the marine DA j-jobs all depend on PREP, so I'll have to stick to the specified $DATA=${RUN}ocnanal_${cyc}.
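For context, the opt-in pattern the dependent j-jobs use (visible in the JGDAS_GLOBAL_OCEAN_ANALYSIS_RUN diff further down) looks roughly like this; the comments are editorial, not part of the PR:

# Reuse the directory staged by PREP and keep it across the prep/run/post sequence
WIPE_DATA="NO"
export DATA="${DATAROOT}/${RUN}ocnanal_${cyc}"
source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalrun" -c "base ocnanal ocnanalrun"

# Jobs that only read products from $COM (like the reworked VRFY job) omit the two lines
# above and accept the default per-job directory, roughly ${DATAROOT}/${job}.${jobid}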

source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalrun" -c "base ocnanal ocnanalrun"


##############################################
# Set variables used in the script
##############################################


##############################################
# Begin JOB SPECIFIC work
##############################################

export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/ocean}

###############################################################
# Run relevant script

EXSCRIPT=${GDASPREPPY:-${HOMEgfs}/sorc/gdas.cd/scripts/exgdas_global_marine_analysis_bmat_vrfy.sh}
Contributor:

Will this script eventually move to g-w scripts?

Contributor Author:

Yes, at some point in the future, @RussTreadon-NOAA.

Contributor:

OK. I understand the utility in keeping it in GDASApp for the time being.
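As an aside, because the j-job only supplies a default path, the eventual move to g-w should not require touching it; a hedged sketch (the override path below is hypothetical):

# Point the j-job at a different copy of the ex-script before launching it
export GDASPREPPY="${HOMEgfs}/scripts/exgdas_global_marine_analysis_bmat_vrfy.sh"  # hypothetical g-w location
"${HOMEgfs}/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_BMAT_VRFY"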

${EXSCRIPT}
status=$?
[[ ${status} -ne 0 ]] && exit "${status}"

##############################################
# End JOB SPECIFIC work
##############################################

##############################################
# Final processing
##############################################
if [[ -e "${pgmout}" ]] ; then
cat "${pgmout}"
fi

##########################################
# Do not remove the Temporary working directory (do this in POST)
##########################################
cd "${DATAROOT}" || exit 1

exit 0
38 changes: 37 additions & 1 deletion jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_POST
@@ -6,13 +6,49 @@ DATA="${DATAROOT}/${RUN}ocnanal_${cyc}"
source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalpost" -c "base ocnanalpost"


##############################################
# Begin JOB SPECIFIC work
##############################################

export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/ocean}
export CDATE=${CDATE:-${PDY}${cyc}}

mkdir -p "${COMOUT}"

###############################################################
# Run relevant script
###############################################################

# Save some of the DA cycle output to COMOUT
# TODO: Move to a dedicated script

# Make a copy of the IAU increment
cp "${DATA}/inc.nc" "${COMOUT}/${CDUMP}.t${cyc}z.ocninc.nc"

# TODO: Dump-splash of the sea-ice restart not done yet

# Copy of the ioda output files, as is for now
cp -r "${DATA}/diags" "${COMOUT}"

# Copy of the diagonal of the background error for the cycle
bdate=$(date -d "${CDATE:0:8} ${CDATE:8:2} - 3 hours" +"%Y-%m-%dT%H:00:00Z")
cp "${DATA}/ocn.bkgerr_stddev.incr.${bdate}.nc" "${COMOUT}/${CDUMP}.t${cyc}z.ocn.bkgerr_stddev.nc"
cp "${DATA}/ice.bkgerr_stddev.incr.${bdate}.nc" "${COMOUT}/${CDUMP}.t${cyc}z.ice.bkgerr_stddev.nc"

# Copy the localization and correlation operators
cp -rL "${DATA}/bump" "${COMOUT}/bump"

# Copy the analysis in the middle of the window
cdate=$(date -d "${CDATE:0:8} ${CDATE:8:2}" +"%Y-%m-%dT%H:00:00Z")
cp "${DATA}/Data/ocn.3dvarfgat_pseudo.an.${cdate}.nc" "${COMOUT}/${CDUMP}.t${cyc}z.ocnana.nc"

# TODO (#982)
# Copy DA grid (computed for the start of the window)
bcyc=$(((cyc - 3 + 24) % 24))
cp "${DATA}/soca_gridspec.nc" "${COMOUT}/${CDUMP}.t${bcyc}z.ocngrid.nc"

# Copy logs
mkdir -p "${COMOUT}/logs"
cp "${DATA}/*.out" "${COMOUT}/logs"

##########################################
# Remove the Temporary working directory
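For readers following the timestamp handling above: the POST job converts CDATE into the ISO-8601 stamps SOCA uses in its file names, offset back to the start of the assimilation window. A worked sketch, assuming GNU date and an example cycle of 2021032412 (values are illustrative):

CDATE=2021032412
cyc=12

# Start of the window, 3 hours before the cycle time
bdate=$(date -d "${CDATE:0:8} ${CDATE:8:2} - 3 hours" +"%Y-%m-%dT%H:00:00Z")
echo "${bdate}"   # 2021-03-24T09:00:00Z

# Middle of the window, i.e. the cycle time itself
cdate=$(date -d "${CDATE:0:8} ${CDATE:8:2}" +"%Y-%m-%dT%H:00:00Z")
echo "${cdate}"   # 2021-03-24T12:00:00Z

# Cycle hour at the start of the window, wrapped around midnight
bcyc=$(((cyc - 3 + 24) % 24))
echo "${bcyc}"    # 9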
11 changes: 2 additions & 9 deletions jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_RUN
@@ -2,22 +2,15 @@
export STRICT="NO"
source "${HOMEgfs}/ush/preamble.sh"
WIPE_DATA="NO"
DATA="${DATAROOT}/${RUN}ocnanal_${cyc}"
export DATA="${DATAROOT}/${RUN}ocnanal_${cyc}"
source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalrun" -c "base ocnanal ocnanalrun"


##############################################
# Set variables used in the script
##############################################

export CDUMP=${CDUMP:-${RUN:-"gfs"}}
export COMPONENT="ocean"

##############################################
# Begin JOB SPECIFIC work
##############################################

export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/${COMPONENT}}
export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/ocean}

###############################################################
# Run relevant script
45 changes: 45 additions & 0 deletions jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_VRFY
@@ -0,0 +1,45 @@
#!/bin/bash
export STRICT="NO"
source "${HOMEgfs}/ush/preamble.sh"
source "${HOMEgfs}/ush/jjob_header.sh" -e "ocnanalprep" -c "base ocnanal ocnanalprep"


##############################################
# Set variables used in the script
##############################################


##############################################
# Begin JOB SPECIFIC work
##############################################

export COMOUT=${COMOUT:-${ROTDIR}/${CDUMP}.${PDY}/${cyc}/ocean}

# Add UFSDA to PYTHONPATH
export PYTHONPATH=${HOMEgfs}/sorc/gdas.cd/ush/:${PYTHONPATH}

###############################################################
# Run relevant script

EXSCRIPT=${GDASPREPPY:-${HOMEgfs}/sorc/gdas.cd/scripts/exgdas_global_marine_analysis_vrfy.py}
${EXSCRIPT}
status=$?
[[ ${status} -ne 0 ]] && exit "${status}"

##############################################
# End JOB SPECIFIC work
##############################################

##############################################
# Final processing
##############################################
if [[ -e "${pgmout}" ]] ; then
cat "${pgmout}"
fi

##########################################
# Do not remove the Temporary working directory (do this in POST)
##########################################
cd "${DATAROOT}" || exit 1

exit 0
8 changes: 4 additions & 4 deletions jobs/rocoto/ocnanalpost.sh
@@ -3,10 +3,10 @@
source "${HOMEgfs}/ush/preamble.sh"

###############################################################
# Source GDASApp modules
module purge
module use "${HOMEgfs}/sorc/gdas.cd/modulefiles"
module load GDAS/"${machine,,}"
# Source UFSDA workflow modules
. "${HOMEgfs}/ush/load_ufsda_modules.sh"
status=$?
[[ ${status} -ne 0 ]] && exit "${status}"

export job="ocnanalpost"
export jobid="${job}.$$"
9 changes: 5 additions & 4 deletions jobs/rocoto/ocnanalprep.sh
@@ -1,12 +1,13 @@
#! /usr/bin/env bash

export STRICT="NO"
source "${HOMEgfs}/ush/preamble.sh"

###############################################################
# Source GDASApp modules
module purge
module use "${HOMEgfs}/sorc/gdas.cd/modulefiles"
module load "GDAS/${machine,,}"
# Source UFSDA workflow modules
. "${HOMEgfs}/ush/load_ufsda_modules.sh"
status=$?
[[ ${status} -ne 0 ]] && exit "${status}"

export job="ocnanalprep"
export jobid="${job}.$$"
8 changes: 4 additions & 4 deletions jobs/rocoto/ocnanalrun.sh
@@ -3,10 +3,10 @@
source "${HOMEgfs}/ush/preamble.sh"

###############################################################
# Source GDASApp modules
module purge
module use "${HOMEgfs}/sorc/gdas.cd/modulefiles"
module load GDAS/"${machine,,}"
# Source UFSDA workflow modules
. "${HOMEgfs}/ush/load_ufsda_modules.sh"
status=$?
[[ ${status} -ne 0 ]] && exit "${status}"

export job="ocnanalrun"
export jobid="${job}.$$"
19 changes: 19 additions & 0 deletions jobs/rocoto/ocnanalvrfy.sh
@@ -0,0 +1,19 @@
#! /usr/bin/env bash

export STRICT="NO"
source "${HOMEgfs}/ush/preamble.sh"

###############################################################
# Source UFSDA workflow modules
. "${HOMEgfs}/ush/load_ufsda_modules.sh" --eva
status=$?
[[ ${status} -ne 0 ]] && exit "${status}"

export job="ocnanalvrfy"
export jobid="${job}.$$"

###############################################################
# Execute the JJOB
"${HOMEgfs}/jobs/JGDAS_GLOBAL_OCEAN_ANALYSIS_VRFY"
status=$?
exit "${status}"
21 changes: 19 additions & 2 deletions ush/load_ufsda_modules.sh
@@ -6,6 +6,23 @@ if [[ "${DEBUG_WORKFLOW:-NO}" == "NO" ]]; then
set +x
fi

# Read optional module argument, default is to use GDAS
MODS="GDAS"
if [[ $# -gt 0 ]]; then
case "$1" in
--eva)
MODS="EVA"
;;
--gdas)
MODS="GDAS"
;;
*)
echo "Invalid option: $1" >&2
exit 1
;;
esac
fi

# Setup runtime environment by loading modules
ulimit_s=$( ulimit -S -s )

@@ -23,14 +40,14 @@ elif [[ -d /lfs3 ]] ; then
echo WARNING: UFSDA NOT SUPPORTED ON THIS PLATFORM
elif [[ -d /scratch1 ]] ; then
# We are on NOAA Hera
module load GDAS/hera
module load "${MODS}/hera"
if [[ "${DEBUG_WORKFLOW}" == "YES" ]] ; then
module list
pip list
fi
elif [[ -d /work ]] ; then
# We are on MSU Orion
module load GDAS/orion
module load "${MODS}/orion"
if [[ "${DEBUG_WORKFLOW}" == "YES" ]] ; then
module list
pip list
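Usage note on the new optional argument to load_ufsda_modules.sh (the rocoto wrappers above already exercise both paths); a sketch, assuming the script is sourced from a workflow job:

# Default: load the GDAS module stack
. "${HOMEgfs}/ush/load_ufsda_modules.sh"

# Verification jobs load the EVA stack instead
. "${HOMEgfs}/ush/load_ufsda_modules.sh" --eva

# Explicitly requesting GDAS is equivalent to the default
. "${HOMEgfs}/ush/load_ufsda_modules.sh" --gdas

# Anything else prints "Invalid option" to stderr and exits non-zero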