Make necessary bugfixes to get aerosol cycling going #1349

Merged — 15 commits, Mar 1, 2023

Changes from all commits
Externals.cfg — 2 changes: 1 addition & 1 deletion

@@ -50,7 +50,7 @@ protocol = git
 required = False

 [GDASApp]
-hash = f7c23af
+hash = a00b5da
 local_path = sorc/gdas.cd
 repo_url = https://github.com/NOAA-EMC/GDASApp.git
 protocol = git
parm/parm_gdas/aeroanl_inc_vars.yaml — 2 changes: 1 addition & 1 deletion

@@ -1 +1 @@
-['dust1', 'dust2', 'dust3', 'dust4', 'dust5', 'seas1', 'seas2', 'seas3', 'seas4', 'so4', 'oc1', 'oc2', 'bc1', 'bc2']
+incvars: ['dust1', 'dust2', 'dust3', 'dust4', 'dust5', 'seas1', 'seas2', 'seas3', 'seas4', 'so4', 'oc1', 'oc2', 'bc1', 'bc2']
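The old file held a bare YAML sequence, so parsing it produced a Python list that cannot be indexed by key; adding the top-level `incvars:` key yields a mapping. A minimal sketch of the difference (assuming pygw's `YAMLFile` parses standard YAML into plain Python containers):

```python
# What the two file layouts parse to (standard YAML semantics assumed):
old_doc = ['dust1', 'dust2', 'so4']                 # bare sequence -> list
new_doc = {'incvars': ['dust1', 'dust2', 'so4']}    # mapping with a key

try:
    old_doc['incvars']              # indexing a list with a str fails
except TypeError as err:
    print(f'old layout: {err}')

incvars = new_doc['incvars']        # works with the new layout
print(incvars)
```

This is why the companion change in aero_analysis.py indexes the parsed file with `['incvars']`.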
sorc/checkout.sh — 2 changes: 1 addition & 1 deletion

@@ -165,7 +165,7 @@ if [[ ${checkout_gsi} == "YES" ]]; then
 fi

 if [[ ${checkout_gdas} == "YES" ]]; then
-  checkout "gdas.cd" "https://github.com/NOAA-EMC/GDASApp.git" "f7c23af"; errs=$((errs + $?))
+  checkout "gdas.cd" "https://github.com/NOAA-EMC/GDASApp.git" "a00b5da"; errs=$((errs + $?))
 fi

 if [[ ${checkout_gsi} == "YES" || ${checkout_gdas} == "YES" ]]; then
ush/load_ufsda_modules.sh — 11 changes: 8 additions & 3 deletions

@@ -53,6 +53,14 @@ elif [[ -d /scratch1 ]] ; then
   module load prod_util
 elif [[ -d /work ]] ; then
   # We are on MSU Orion
+  # prod_util stuff, find a better solution later...
Review thread on the prod_util workaround:

Contributor: Any estimate from the hpc-stack team when this problem will be resolved? It would be nice to remove this hack sooner rather than later.

Author: Agreed; it's not likely to happen until the UFS spack-stack and the JEDI spack-stack are one and the same...

Contributor: OK. Thanks for the timing sequence.

@aerorahul (Feb 28, 2023): The stack unification will take a while till the GSI and GDASApp can be built and run w/ the same compiler version.

+  #module use /apps/contrib/NCEP/hpc-stack/libs/hpc-stack/modulefiles/compiler/intel/2022.1.2/
+  #module load prod_util
+  export UTILROOT=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2
+  export MDATE=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2/bin/mdate
+  export NDATE=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2/bin/ndate
+  export NHOUR=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2/bin/nhour
+  export FSYNC=/work2/noaa/da/python/opt/intel-2022.1.2/prod_util/1.2.2/bin/fsync_file
Review thread on lines +60 to +63:

Contributor: May I ask what is using this?

Author: Good question; there were things that were failing due to `set -e` because these weren't defined... perhaps that isn't true anymore? It might not actually be anything besides this:

export DBNROOT=${DBNROOT:-${UTILROOT}/fakedbn}

and I just defined everything in the prod_util module for completeness' sake.

Contributor: Why can't we just add prod_util to the UFSDA stack?

Author: We could, but it is maintained by JCSDA (Dom) and probably isn't worth it? Would there be a need to include these tools in the workflow scripts before JEDI is on WCOSS2?

   module load "${MODS}/orion"
   if [[ "${DEBUG_WORKFLOW:-NO}" == "YES" ]] ; then
     module list
@@ -62,9 +70,6 @@ elif [[ -d /work ]] ; then
   ncdump=$( which ncdump )
   NETCDF=$( echo "${ncdump}" | cut -d " " -f 3 )
   export NETCDF
-  # prod_util stuff, find a better solution later...
-  module use /apps/contrib/NCEP/hpc-stack/libs/hpc-stack/modulefiles/compiler/intel/2022.1.2/
-  module load prod_util
 elif [[ -d /glade ]] ; then
   # We are on NCAR Yellowstone
   echo WARNING: UFSDA NOT SUPPORTED ON THIS PLATFORM
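The prod_util tools exported above (NDATE, MDATE, NHOUR) do date arithmetic on YYYYMMDDHH timestamps. As a reference for what the workaround provides, here is a pure-Python sketch of ndate's behavior (shift a stamp by a signed number of hours; semantics assumed from common prod_util usage):

```python
from datetime import datetime, timedelta

def ndate(hours: int, yyyymmddhh: str) -> str:
    """Sketch of prod_util's ndate: shift a YYYYMMDDHH stamp by `hours`."""
    t = datetime.strptime(yyyymmddhh, '%Y%m%d%H') + timedelta(hours=hours)
    return t.strftime('%Y%m%d%H')

print(ndate(6, '2023022818'))   # -> '2023030100' (rolls over the month)
```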
ush/python/pygfs/task/aero_analysis.py — 15 changes: 8 additions & 7 deletions

@@ -13,6 +13,7 @@
 from pygw.timetools import to_isotime, to_fv3time, to_timedelta
 from pygw.fsutils import rm_p
 from pygw.template import Template, TemplateConstants
+from pygw.timetools import to_fv3time
 from pygw.yaml_file import YAMLFile
 from pygw.logger import logit
 from pygfs.task.analysis import Analysis
@@ -84,8 +85,8 @@ def initialize(self: Analysis) -> None:

         # stage berror files
         # copy BUMP files, otherwise it will assume ID matrix
-        if self.task_config.get('STATICB_TYPE', 'bump_aero') in ['bump_aero']:
-            FileHandler(AerosolAnalysis.get_berror_dict(self.task_config)).sync()
+        if self.task_config.get('STATICB_TYPE', 'identity') in ['bump']:
+            FileHandler(self.get_berror_dict(self.task_config)).sync()

         # stage backgrounds
         FileHandler(self.get_bkg_dict(AttrDict(self.task_config, **self.task_config))).sync()
@@ -148,8 +149,8 @@ def finalize(self: Analysis) -> None:
         src = os.path.join(self.task_config['DATA'], f"{self.task_config['CDUMP']}.t{self.runtime_config['cyc']:02d}z.aerovar.yaml")
         dest = os.path.join(self.task_config['COMOUTaero'], f"{self.task_config['CDUMP']}.t{self.runtime_config['cyc']:02d}z.aerovar.yaml")
         yaml_copy = {
-            'mkdir': self.task_config['COMOUTaero'],
-            'copy': [src, dest]
+            'mkdir': [self.task_config['COMOUTaero']],
+            'copy': [[src, dest]]
         }
         FileHandler(yaml_copy).sync()
@@ -192,7 +193,7 @@ def _add_fms_cube_sphere_increments(self: Analysis) -> None:
         fms_bkg_file_template = os.path.join(self.task_config.comin_ges_atm, 'RESTART', f'{self.task_config.cdate_fv3}.fv_tracer.res.tileX.nc')
         # get list of increment vars
         incvars_list_path = os.path.join(self.task_config['HOMEgfs'], 'parm', 'parm_gdas', 'aeroanl_inc_vars.yaml')
-        incvars = YAMLFile(path=incvars_list_path)
+        incvars = YAMLFile(path=incvars_list_path)['incvars']
         super().add_fv3_increments(fms_inc_file_template, fms_bkg_file_template, incvars)

     @logit(logger)
@logit(logger)
Expand Down Expand Up @@ -254,10 +255,10 @@ def get_berror_dict(self, config: Dict[str, Any]) -> Dict[str, List[str]]:
berror_dict: Dict
a dictionary containing the list of background error files to copy for FileHandler
"""
super.get_berror_dict(config)
super().get_berror_dict(config)
# aerosol static-B needs nicas, cor_rh, cor_rv and stddev files.
b_dir = config['BERROR_DATA_DIR']
b_datestr = config['BERROR_DATE']
b_datestr = to_fv3time(config['BERROR_DATE'])
berror_list = []

for ftype in ['cor_rh', 'cor_rv', 'stddev']:
Expand Down
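The `super.get_berror_dict(config)` line was a genuine bug: bare `super` is the built-in type object, not a bound proxy, so attribute lookup on it raises `AttributeError`. A minimal reproduction (class names here are an illustrative sketch, not the project's actual classes):

```python
class Analysis:
    def get_berror_dict(self, config):
        return dict(config, source='base')

class AerosolAnalysis(Analysis):
    def get_berror_dict(self, config):
        # super() (with parentheses) returns a proxy bound to self;
        # bare `super` is just the built-in type object.
        return super().get_berror_dict(config)

try:
    super.get_berror_dict        # what the old code effectively did
except AttributeError as err:
    print(f'bug reproduced: {err}')

print(AerosolAnalysis().get_berror_dict({'kind': 'aero'}))
```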
ush/python/pygfs/task/analysis.py — 5 changes: 3 additions & 2 deletions

@@ -22,7 +22,8 @@ class Analysis(Task):
     """

     def __init__(self, config: Dict[str, Any]) -> None:
-        super().__init__(config, ntiles=6)
+        super().__init__(config)
+        self.config.ntiles = 6

     def initialize(self) -> None:
         super().initialize()

@@ -85,7 +86,7 @@ def add_fv3_increments(self, inc_file_tmpl: str, bkg_file_tmpl: str, incvars: List
                 rstfile.variables[vname][:] = anl[:]
                 try:
                     rstfile.variables[vname].delncattr('checksum')  # remove the checksum so fv3 does not complain
-                except AttributeError:
+                except (AttributeError, RuntimeError):
                     pass  # checksum is missing, move on

     @logit(logger)
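Widening the except clause matters because the netCDF library can signal a missing attribute as either `AttributeError` or `RuntimeError` (version-dependent; assumed here from the fix). A generic sketch of the tolerant-delete pattern, using a stub in place of a real `netCDF4` variable:

```python
class StubVar:
    """Stand-in for a netCDF4 variable whose attribute delete can fail
    with either exception type (hypothetical stub, not the real API)."""
    def __init__(self, exc):
        self.exc = exc
    def delncattr(self, name):
        raise self.exc(f'{name} not found')

def drop_checksum(var) -> None:
    try:
        var.delncattr('checksum')   # remove checksum so fv3 does not complain
    except (AttributeError, RuntimeError):
        pass                        # attribute missing: safe to move on

for exc in (AttributeError, RuntimeError):
    drop_checksum(StubVar(exc))     # neither failure mode propagates
print('both failure modes tolerated')
```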
workflow/rocoto/workflow_tasks.py — 3 changes: 1 addition & 2 deletions

@@ -463,8 +463,7 @@ def aeroanlinit(self):
         data = f'&ROTDIR;/gdas.@Y@m@d/@H/atmos/gdas.t@Hz.atmf009.nc'
         dep_dict = {'type': 'data', 'data': data, 'offset': '-06:00:00'}
         deps.append(rocoto.add_dependency(dep_dict))
-        data = f'{dmpdir}/{self.cdump}{dump_suffix}.@Y@m@d/@H/atmos/{self.cdump}.t@Hz.updated.status.tm00.bufr_d'
-        dep_dict = {'type': 'data', 'data': data}
+        dep_dict = {'type': 'task', 'name': f'{self.cdump}prep'}
         deps.append(rocoto.add_dependency(dep_dict))
         dependencies = rocoto.create_dependency(dep_condition='and', dep=deps)
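This change swaps a brittle data dependency on a dump file for a task dependency on the prep job. A hedged sketch of the XML fragments such dependency dicts are assumed to produce for the Rocoto workflow manager (an illustration, not the project's actual `rocoto` module):

```python
def add_dependency(dep: dict) -> str:
    """Hypothetical rendering of a dependency dict to Rocoto-style XML."""
    if dep['type'] == 'task':
        return f'<taskdep task="{dep["name"]}"/>'
    if dep['type'] == 'data':
        if 'offset' in dep:
            return (f'<datadep><cyclestr offset="{dep["offset"]}">'
                    f'{dep["data"]}</cyclestr></datadep>')
        return f'<datadep>{dep["data"]}</datadep>'
    raise ValueError(f'unknown dependency type: {dep["type"]}')

# A task dependency waits on job completion rather than on a file appearing:
print(add_dependency({'type': 'task', 'name': 'gdasprep'}))
```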