Include documentation for building/running SRW App on Mac #240
@@ -72,7 +72,11 @@ The SRW Application source code is publicly available on GitHub. To download the

..
   COMMENT: This will need to be changed to the updated release branch of the SRW repo once it exists.

The cloned repository contains the configuration files and sub-directories shown in
:numref:`Table %s <FilesAndSubDirs>`. The user may set an ``$SRW`` environment variable to point to the location of the new ``ufs-srweather-app`` repository. For example, if ``ufs-srweather-app`` was cloned into the ``$HOME`` directory:

.. code-block:: console

   export SRW=$HOME/ufs-srweather-app

.. _FilesAndSubDirs:
@@ -120,15 +124,14 @@ Run the executable that pulls in SRW App components from external repositories:

.. code-block:: console

   cd $SRW
   ./manage_externals/checkout_externals


Build with ``devbuild.sh``
==========================

On Level 1 systems, where a modulefile is provided under the ``modulefiles`` directory, users can build the SRW App binaries with:

.. code-block:: console
@@ -140,46 +143,114 @@ If compiler auto-detection fails for some reason, specify it using

   ./devbuild.sh --platform=hera --compiler=intel

If this method doesn't work, users will have to manually set up the environment and build the SRW App binaries with CMake.

..
   COMMENT: What would this entail?!
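The general steps are to set up the build environment (see :numref:`Section %s <SetUpBuild>`), create a build directory, and invoke ``cmake`` and ``make``, much as in :numref:`Section %s <BuildExecutables>` below. A minimal sketch, not taken verbatim from these instructions (the exact CMake flags, such as ``-DCMAKE_INSTALL_PREFIX``, may differ on a given system):

.. code-block:: console

   # Assumes the build modulefile has already been loaded (see below)
   mkdir $SRW/build && cd $SRW/build
   cmake .. -DCMAKE_INSTALL_PREFIX=..
   make -j 4 >& build.out &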
.. _SetUpBuild:

Set up the Build/Run Environment
================================

Before building the SRW App, users must set up their environment to run a workflow or to build the SRW App with CMake. Note that ``devbuild.sh`` does not prepare the environment for workflow runs, so this step is necessary even if the binaries were built successfully with ``devbuild.sh``.

The build environment must be set up for the user's specific platform. First, users need to make sure that ``Lmod`` is the application used for loading modulefiles. That is the case on most systems; however, on some systems such as Gaea/Odin, the default modulefile loader is from Cray, and users must swap it for ``Lmod``. For example, on Gaea, assuming a ``bash`` login shell, run:

.. code-block:: console

   source etc/lmod-setup.sh gaea

or, if your login shell is ``csh`` or ``tcsh``, source ``etc/lmod-setup.csh`` instead. If you execute the above command on systems that don't need it, it will simply do a ``module purge``. From here on, we can assume that ``Lmod`` is ready to load the modulefiles needed by the SRW App.

The modulefiles needed for building and running the SRW App are located in the ``modulefiles`` directory. To load the necessary modulefile for a specific ``<platform>`` using a given ``<compiler>``, run:

.. code-block:: console

   module use <path/to/ufs-srweather-app/modulefiles>
   module load build_<platform>_<compiler>

where ``<path/to/ufs-srweather-app/modulefiles>`` is the full path to the ``modulefiles`` directory. This will work on Level 1 systems, where a modulefile is available in the ``modulefiles`` directory.

On Level 2-4 systems, users will need to modify certain environment variables, such as the path to HPC-Stack, so that the SRW App can find and load the appropriate modules. For systems with Lmod installed, one of the current ``build_<platform>_<compiler>`` modulefiles can be copied and used as a template. To check whether Lmod is installed, run ``echo $LMOD_PKG`` and see if it outputs a path to the Lmod package. On systems without Lmod, users can modify or set the required environment variables with the ``export`` or ``setenv`` commands, depending on whether they are using a bash or csh/tcsh shell, respectively:

.. code-block:: console

   export <VARIABLE_NAME>=<PATH_TO_MODULE>
   setenv <VARIABLE_NAME> <PATH_TO_MODULE>
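For example, a bash user might point the build at a local HPC-Stack installation; the variable name follows the MacOS modulefile shown later, and the path below is illustrative only:

.. code-block:: console

   # Illustrative path; replace with the actual HPC-Stack install location
   export HPCstack=$HOME/hpc-stack/install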
.. _MacDetails:

Additional Details for Building on MacOS
------------------------------------------

.. note::
   Users not building the SRW App to run on MacOS may skip to the :ref:`next section <BuildExecutables>`.

The SRW App can be built on MacOS systems, presuming HPC-Stack has already been successfully installed. The following two options have been tested:

* **Option 1:** MacBook Air 2020, M1 chip (arm64, running natively), 4+4 cores, Big Sur 11.6.4, GNU compiler suite v.11.2.0_3 (gcc, gfortran, g++); no MPI pre-installed

* **Option 2:** MacBook Pro 2015, 2.8 GHz Quad-Core Intel Core i7 (x86_64), Catalina OS X 10.15.7, GNU compiler suite v.11.2.0_3 (gcc, gfortran, g++); no MPI pre-installed

.. note::
   Examples in this subsection presume that the user is running Terminal.app with a bash shell environment. If this is not the case, users will need to adjust the commands to fit their command line application and shell environment.

The ``build_macos_gnu`` modulefile initializes the module environment, lists the location of HPC-Stack modules, loads the meta-modules and modules, and sets compilers, additional flags, and environment variables needed for building the SRW App. The modulefile must be modified to include the absolute path to the user's HPC-Stack installation and ``ufs-srweather-app`` directories. In particular, the following section must be modified:

.. code-block:: console

   # This path should point to your HPCstack installation directory
   setenv HPCstack "/Users/username/hpc-stack/install"

   # This path should point to your SRW Application directory
   setenv SRW "/Users/username/ufs-srweather-app"
**Review discussion:** One reviewer suggested using bash syntax here (e.g., ``export SRW=/Users/username/ufs-srweather-app``) for consistency with the bash-based MacOS instructions. Other reviewers noted that TCL modulefiles use ``setenv`` rather than ``export`` and set the environment regardless of the user's login shell (bash or tcsh), so the csh-like syntax works as-is; users only need to change the paths.
An excerpt of the ``build_macos_gnu`` contents appears below for Option 1. To use Option 2, the user will need to comment out the lines specific to Option 1 and uncomment the lines specific to Option 2 in the ``build_macos_gnu`` modulefile. Additionally, users need to verify that all file paths reflect their system's configuration and that the correct version numbers for software libraries appear in the modulefile.

.. code-block:: console

   # Option 1 compiler paths:
   setenv CC "/opt/homebrew/bin/gcc"
   setenv FC "/opt/homebrew/bin/gfortran"
   setenv CXX "/opt/homebrew/bin/g++"

   # Option 2 compiler paths:
   #setenv CC "/usr/local/bin/gcc"
   #setenv FC "/usr/local/bin/gfortran"
   #setenv CXX "/usr/local/bin/g++"

**Review discussion:** A reviewer noted that the modulefile also sets the MPI compiler wrappers (e.g., ``MPI_CC``, ``CMAKE_C_COMPILER``); these settings are already present in ``build_macos_gnu``, so users do not need to adjust them.
Then, users must source the Lmod setup file, just as they would on other systems, and load the modulefiles needed for building and running the SRW App:

.. code-block:: console

   source etc/lmod-setup.sh
   module use <path/to/ufs-srweather-app/modulefiles>
   module load build_macos_gnu

.. note::
   If you execute ``source etc/lmod-setup.sh`` on systems that don't need it, it will simply do a ``module purge``.
Additionally, for Option 1 systems, set the variable ``ENABLE_QUAD_PRECISION`` to ``OFF`` in line 35 of the ``$SRW/src/ufs-weather-model/FV3/atmos_cubed_sphere/CMakeLists.txt`` file. This change is optional if using Option 2 to build the SRW App. Using a text editor (e.g., vi, vim, emacs), change the line to read:

.. code-block:: console

   option(ENABLE_QUAD_PRECISION "Enable compiler definition -DENABLE_QUAD_PRECISION" OFF)

An alternative way to make these changes is with ``sed`` (stream editor). From the command line, the first command below turns off quad precision in ``CMakeLists.txt``, and the second changes the interpreter line of the shell scripts in the current directory from ``/bin/sh`` to ``/bin/bash``:

.. code-block:: console

   sed -i -e 's/QUAD_PRECISION\" ON)/QUAD_PRECISION\" OFF)/' CMakeLists.txt
   sed -i -e 's/bin\/sh/bin\/bash/g' *sh
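Note that the BSD ``sed`` shipped with MacOS interprets the argument after ``-i`` as a backup suffix, so the commands above may leave backup files ending in ``-e``. A minor variation (not part of the original instructions) that edits in place without creating backups passes an empty suffix:

.. code-block:: console

   # BSD sed: empty backup suffix after -i avoids stray "-e" backup files
   sed -i '' -e 's/QUAD_PRECISION\" ON)/QUAD_PRECISION\" OFF)/' CMakeLists.txt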
.. _BuildExecutables:

Build the Executables
=======================

Create a directory within ``$SRW`` to hold the build's executables:

.. code-block:: console
@@ -528,13 +599,17 @@ To get started, make a copy of ``config.community.sh``. From the ``ufs-srweather

.. code-block:: console

   cd $SRW/regional_workflow/ush
   cp config.community.sh config.sh

The default settings in this file include a predefined 25-km :term:`CONUS` grid (RRFS_CONUS_25km), the :term:`GFS` v16 physics suite (FV3_GFS_v16 :term:`CCPP`), and :term:`FV3`-based GFS raw external model data for initialization.

Next, edit the new ``config.sh`` file to customize it for your machine. At a minimum, change the ``MACHINE`` and ``ACCOUNT`` variables; then choose a name for the experiment directory by setting ``EXPT_SUBDIR``. If you have pre-staged the initialization data for the experiment, set ``USE_USER_STAGED_EXTRN_FILES="TRUE"``, and set the paths to the data for ``EXTRN_MDL_SOURCE_BASEDIR_ICS`` and ``EXTRN_MDL_SOURCE_BASEDIR_LBCS``.

.. note::

   MacOS users should refer to :numref:`Section %s <MacConfig>` for details on configuring an experiment on MacOS.

Sample settings are indicated below for Level 1 platforms. Detailed guidance applicable to all systems can be found in :numref:`Chapter %s: Configuring the Workflow <ConfigWorkflow>`, which discusses each variable and the options available. Additionally, information about the three predefined Limited Area Model (LAM) Grid options can be found in :numref:`Chapter %s: Limited Area Model (LAM) Grids <LAMGrids>`.

.. important::
@@ -630,8 +705,7 @@ For WCOSS_CRAY:

.. note::

   The values of the configuration variables should be consistent with those in the
   ``valid_param_vals`` script. In addition, various example configuration files can be found in the ``regional_workflow/tests/baseline_configs`` directory.

.. _VXConfig:
@@ -686,8 +760,8 @@ These tasks are independent, so users may set some values to "TRUE" and others t

.. _SetUpPythonEnv:

Set Up the Python and Other Environment Parameters
----------------------------------------------------

The workflow requires Python 3 with the packages ``PyYAML``, ``Jinja2``, and ``f90nml`` available. This Python environment has already been set up on Level 1 platforms, and it can be activated in the following way (from ``/ufs-srweather-app/regional_workflow/ush``):

.. code-block:: console
@@ -702,6 +776,126 @@ This command will activate the ``regional_workflow`` conda environment. The user

   source ~/.bashrc
   conda activate regional_workflow
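To verify that the required Python packages are importable from the active environment, a quick check (not part of the original instructions) is:

.. code-block:: console

   # Should exit silently if PyYAML, Jinja2, and f90nml are available
   python3 -c "import yaml, jinja2, f90nml"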
.. _MacConfig:

Configuring an Experiment on MacOS
------------------------------------------------------------

In principle, the configuration process for MacOS systems is the same as for other systems; however, it requires a few extra steps, which are detailed below.

.. _MacMorePackages:

Install Additional Packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Check the version of bash, and upgrade it (via Homebrew) if it is lower than 4. Additionally, install the ``coreutils`` package:

.. code-block:: console

   bash --version
   brew upgrade bash
   brew install coreutils
.. _MacVEnv:

Create a Python Virtual Environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Users must create a Python virtual environment for running the SRW App on MacOS. This involves setting Python 3 and pip as the defaults, creating and activating the ``regional_workflow`` virtual environment, and installing the required Python packages:

.. code-block:: console

   python3 -m pip --version
   python3 -m pip install --upgrade pip
   python3 -m ensurepip --default-pip
   python3 -m venv $HOME/venv/regional_workflow
   source $HOME/venv/regional_workflow/bin/activate
   python3 -m pip install jinja2
   python3 -m pip install pyyaml
   python3 -m pip install f90nml
   python3 -m pip install ruby   # alternatively: brew install ruby
The virtual environment can be deactivated by running the ``deactivate`` command. The virtual environment built here will be reactivated in :numref:`Step %s <MacActivateWFenv>` and needs to be used to generate the workflow and run the experiment.
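For example, to leave the environment and later re-enter it:

.. code-block:: console

   deactivate
   source $HOME/venv/regional_workflow/bin/activate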
Install Rocoto
^^^^^^^^^^^^^^^^^^

.. note::
   Users may `install Rocoto <https://github.com/christopherwharrop/rocoto/blob/develop/INSTALL>`__ if they want to make use of a workflow manager to run their experiments. However, this option has not been tested yet on MacOS and is not supported for this release.

Configure the SRW App
^^^^^^^^^^^^^^^^^^^^^^^
Users will need to configure their experiment just like on any other system. From the ``$SRW/regional_workflow/ush`` directory, users can copy the settings from ``config.community.sh`` into a ``config.sh`` file (see :numref:`Section %s <UserSpecificConfig>` above). In the ``config.sh`` file, users should set ``MACHINE="macos"`` and modify additional variables as needed. For example:

.. code-block:: console

   MACHINE="macos"
   ACCOUNT="user"
   EXPT_SUBDIR="<test_community>"
   COMPILER="gnu"
   VERBOSE="TRUE"
   RUN_ENVIR="community"
   PREEXISTING_DIR_METHOD="rename"

   PREDEF_GRID_NAME="RRFS_CONUS_25km"
   QUILTING="TRUE"

Due to the limited number of processors on MacOS systems, users must adjust the domain decomposition defaults (typically, there are only 8 CPUs on M1-family chips and 4 CPUs on the x86_64 systems tested).
For :ref:`Option 1 <MacDetails>`, add the following information to ``config.sh``:

.. code-block:: console

   LAYOUT_X="${LAYOUT_X:-3}"
   LAYOUT_Y="${LAYOUT_Y:-2}"
   WRTCMP_write_groups="1"
   WRTCMP_write_tasks_per_group="2"

For :ref:`Option 2 <MacDetails>`, add the following information to ``config.sh``:

.. code-block:: console

   LAYOUT_X="${LAYOUT_X:-3}"
   LAYOUT_Y="${LAYOUT_Y:-1}"
   WRTCMP_write_groups="1"
   WRTCMP_write_tasks_per_group="1"
Configure the Machine File
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Configure the machine file based on the number of CPUs in the system (8 or 4). Specify the following variables in ``$SRW/regional_workflow/ush/machine/macos.sh``.

For Option 1 (8 CPUs):

.. code-block:: console

   # Architecture information
   WORKFLOW_MANAGER="none"
   NCORES_PER_NODE=${NCORES_PER_NODE:-8}
   SCHED=${SCHED:-"none"}
   # Run commands for executables
   RUN_CMD_SERIAL="time"
   RUN_CMD_UTILS="mpirun -np 4"
   RUN_CMD_FCST='mpirun -np ${PE_MEMBER01}'
   RUN_CMD_POST="mpirun -np 4"

The same settings can be used for Option 2 (4 CPUs), except that ``NCORES_PER_NODE=${NCORES_PER_NODE:-8}`` should be set to 4 instead of 8, as shown below.
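That is, for Option 2 the line would read:

.. code-block:: console

   NCORES_PER_NODE=${NCORES_PER_NODE:-4}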
.. _MacActivateWFenv:

Activate the Workflow Environment
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

The ``regional_workflow`` environment can be activated on MacOS as it is for any other system:

.. code-block:: console

   cd $SRW/regional_workflow/ush
   source ../../env/wflow_macos.env

This should activate the ``regional_workflow`` environment created in :numref:`Step %s <MacVEnv>`. From here, the user may continue to the :ref:`next step <GenerateWorkflow>` and generate the regional workflow.


.. _GenerateWorkflow:
@@ -891,7 +1085,11 @@ In addition to the baseline tasks described in :numref:`Table %s <WorkflowTasksT

.. _RocotoRun:

Run the Workflow Using Rocoto
==============================

.. attention::
   If users are running the SRW App in a container or on a system that does not have Rocoto installed (e.g., `Level 3 & 4 <https://github.com/ufs-community/ufs-srweather-app/wiki/Supported-Platforms-and-Compilers>`__ systems, such as MacOS), they should follow the process outlined in :numref:`Section %s <RunUsingStandaloneScripts>` instead of the instructions in this section.

The information in this section assumes that Rocoto is available on the desired platform. (Note that Rocoto cannot be used when running the workflow within a container.) If Rocoto is not available, it is still possible to run the workflow using stand-alone scripts according to the process outlined in :numref:`Section %s <RunUsingStandaloneScripts>`. There are two main ways to run the workflow with Rocoto: (1) with the ``launch_FV3LAM_wflow.sh`` script, and (2) by manually calling the ``rocotorun`` command. Users can also automate the workflow using a crontab.
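For illustration only (assuming the default workflow file names used by the SRW App), manually advancing the workflow and checking its status from the experiment directory typically looks like:

.. code-block:: console

   cd $EXPTDIR
   rocotorun -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10
   rocotostat -w FV3LAM_wflow.xml -d FV3LAM_wflow.db -v 10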
Optionally, an environment variable can be set to navigate to the ``$EXPTDIR`` more easily. If the login shell is bash, it can be set as follows:
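A bash version mirroring the ``setenv`` command shown below would be:

.. code-block:: console

   export EXPTDIR=/<path-to-experiment>/<directory_name>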
@@ -904,7 +1102,7 @@ If the login shell is csh/tcsh, it can be set using:

.. code-block:: console

   setenv EXPTDIR /<path-to-experiment>/<directory_name>


Launch the Rocoto Workflow Using a Script
@@ -1067,6 +1265,7 @@ After finishing the experiment, open the crontab using ``crontab -e`` and delete

On Orion, *cron* is only available on the orion-login-1 node, so users will need to work on that node when running *cron* jobs on Orion.

The workflow run is complete when all tasks have "SUCCEEDED", and the ``rocotostat`` command outputs a table similar to the one :ref:`above <Success>`.

.. _PlotOutput: