diff --git a/README.md b/README.md
index b30028ea..adcae6c1 100644
--- a/README.md
+++ b/README.md
@@ -4,33 +4,59 @@
## Introduction
-The main purpose of the Run 3 validation framework is to provide a compact and flexible tool for validation of the
+The Run 3 validation framework is a tool for easy execution, testing and validation of Run 3 analysis code on large local samples.
+
+Its features include
+
+* simple specification of input datasets,
+* simple configuration and activation of analysis tasks,
+* easy generation of the O2 command for complex workflow topologies,
+* job parallelisation,
+* output merging,
+* error checking and reporting,
+* specification of postprocessing.
+
+It also provides tools for:
+
+* post mortem debugging of failing jobs,
+* comparison of histograms between ROOT files,
+* visualisation of workflow dependencies,
+* downloading of data samples from the Grid,
+* maintenance of Git repositories and installations of aliBuild packages.
+
+The original purpose of the Run 3 validation framework was to provide a compact and flexible tool for validation of the
[O2(Physics)](https://github.com/AliceO2Group/O2Physics) analysis framework by comparison of its output to its
[AliPhysics](https://github.com/alisw/AliPhysics) counterpart.
The general idea is to run the same analysis using AliPhysics and O2(Physics) and produce comparison plots.
+However, it can also be used without AliPhysics to run O2 analyses locally, similarly to running trains on AliHyperloop.
+This makes it a convenient framework for local development, testing and debugging of O2(Physics) code.
+
## Overview
The validation framework is a general configurable platform that gives the user full control over what is done.
-Its flexibility is enabled by strict separation of its specialised components into a system of bash scripts.
+Its flexibility is enabled by strict separation of its specialised components into a system of Bash scripts.
Configuration is separate from execution code, input configuration is separate from task configuration, execution steps are separate from the main steering code.
* The steering script [`runtest.sh`](exec/runtest.sh) provides control parameters and interface to the machinery for task execution.
-* User provides configuration bash scripts which:
+* The user provides configuration Bash scripts which:
* modify control parameters,
* produce modified configuration files,
* generate step scripts executed by the framework in the validation steps.
-### Execution
+## Execution
Execution code can be found in the [`exec`](exec) directory.
+**The user should not touch anything in this directory!**
+
The steering script [`runtest.sh`](exec/runtest.sh) performs the following execution steps:
+
* Load input specification.
* Load tasks configuration.
* Print out input description.
* Clean before running. (activated by `DOCLEAN=1`)
- * Deletes specified files.
+ * Deletes specified files (produced by previous runs).
* Generate list of input files.
* Modify the JSON file.
* Convert `AliESDs.root` to `AO2D.root`. (activated by `DOCONVERT=1`)
@@ -51,46 +77,107 @@ The steering script [`runtest.sh`](exec/runtest.sh) performs the following execu
* Executes the postprocessing step script.
* This step typically compares AliPhysics and O2 output and produces plots.
* Clean after running. (activated by `DOCLEAN=1`)
- * Deletes specified files.
+ * Deletes specified (temporary) files.
* Done
* This step is just a visual confirmation that all steps have finished without errors.
All steps are activated by default and some can be disabled individually by setting the respective activation variables to `0` in the user's task configuration, as sketched below.
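+
+For example, to process input that is already in O2 format, a task configuration could deactivate the conversion and AliPhysics steps (a minimal sketch using the activation variables named above):
+
+```bash
+# In the user's task configuration script (e.g. config_tasks.sh):
+DOCLEAN=1        # clean before and after running
+DOCONVERT=0      # skip the AliESDs.root -> AO2D.root conversion
+DOALI=0          # skip the AliPhysics tasks
+DOO2=1           # run the O2 tasks
+DOPOSTPROCESS=1  # run the postprocessing (requires DOALI=1 and/or DOO2=1)
+```
+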
-### Configuration
+## Configuration
The steering script [`runtest.sh`](exec/runtest.sh) can be executed with the following optional arguments:
```bash
-bash [<path>/]runtest.sh [-h] [-i <input config>] [-t <task config>] [-d]
+bash [<path>/]runtest.sh [-h] [-i <input config>] [-t <task config>] [-d]
```
-`-h` Prints out the usage specification above.
+`<input config>` Input specification script. See [Input specification](#input-specification).
-`-d` (Debug mode) Prints out more information about settings and execution.
-
-`<input config>` Input specification
-* Bash script that modifies input parameters.
-* This script defines which data will be processed.
* Defaults to `config_input.sh` (in the current directory).
-`<task config>` Task configuration
-* Bash script that cleans the directory, deactivates steps, modifies the JSON file, generates step scripts.
-* This script defines what the validation steps will do.
+`<task config>` Task configuration script. See [Task configuration](#task-configuration).
+
* Defaults to `config_tasks.sh` (in the current directory).
-* Provides these mandatory functions:
- * `Clean` Performs cleanup before and after running.
- * `AdjustJson` Modifies the JSON file. (e.g. selection cut activation)
- * `MakeScriptAli` Generates the AliPhysics step script.
- * `MakeScriptO2` Generates the O2 step script.
- * `MakeScriptPostprocess` Generates the postprocessing step script. (e.g. plotting)
-* The `Clean` function takes one argument: `$1=1` before running, `$1=2` after running.
-* The AliPhysics and O2 step scripts take two arguments: `$1="<input file>"`, `$2="<JSON file>"`.
-* The postprocessing step script takes two arguments: `$1="<AliPhysics output file>"`, `$2="<O2 output file>"`.
-Implementation of these configuration scripts is fully up to the user.
+`-d` Debug mode. Prints out more information about settings and execution.
+
+`-h` Help. Prints out the usage specification above.
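+
+For example, a run with explicitly named configuration scripts and debug output enabled could look like this (the relative path to the framework is illustrative):
+
+```bash
+bash ../exec/runtest.sh -i config_input.sh -t config_tasks.sh -d
+```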
+
+### Input specification
+
+The input specification script is a Bash script that sets input parameters used by the steering script.
+
+**This script defines which data will be processed and how.**
+
+These are the available input parameters and their default values:
+
+* `INPUT_LABEL="nothing"` Input description
+* `INPUT_DIR="$PWD"` Input directory
+* `INPUT_FILES="AliESDs.root"` Input file pattern
+* `INPUT_SYS="pp"` Collision system (`"pp"`, `"PbPb"`)
+* `INPUT_RUN=2` LHC Run (2, 3, 5)
+* `INPUT_IS_O2=0` Input files are in O2 format.
+* `INPUT_IS_MC=0` Input files are MC data.
+* `INPUT_PARENT_MASK=""` Path replacement mask for the input directory of parent files in case of linked derived O2 input. Set to `";"` if no replacement needed.
+* `JSON="dpl-config.json"` O2 device configuration
+
+Grouping these parameters per input case allows you to define several input datasets and switch between them easily by setting the corresponding value of `INPUT_CASE`.
+
+Other available parameters allow you to specify how many input files to process and how to parallelise the job execution.
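+
+A minimal input specification could look like the following sketch; the labels, paths and case numbers are purely illustrative:
+
+```bash
+# config_input.sh (sketch)
+
+INPUT_CASE=1 # Input case
+
+case $INPUT_CASE in
+  1) # Run 2 real data (ESDs)
+    INPUT_LABEL="Run 2 pp real data"
+    INPUT_DIR="/data/run2_pp" # illustrative path
+    INPUT_FILES="AliESDs.root"
+    INPUT_SYS="pp"
+    INPUT_RUN=2
+    ;;
+  2) # Run 3 MC (AO2Ds)
+    INPUT_LABEL="Run 3 pp MC"
+    INPUT_DIR="/data/run3_mc" # illustrative path
+    INPUT_FILES="AO2D.root"
+    INPUT_RUN=3
+    INPUT_IS_O2=1
+    INPUT_IS_MC=1
+    ;;
+esac
+```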
+
+### Task configuration
-Dummy examples can be found in: [`config/config_input_dummy.sh`](config/config_input_dummy.sh), [`config/config_tasks_dummy.sh`](config/config_tasks_dummy.sh).
+The task configuration script is a Bash script that modifies the task parameters used by the steering script.
+
+**This script defines which validation steps will run and what they will do.**
+
+* It cleans the directory, deactivates incompatible steps, modifies the JSON file, generates step scripts.
+* The body of the script has to provide these mandatory functions (a skeleton is sketched at the end of this section):
+ * `Clean` Performs cleanup before and after running.
+ * `AdjustJson` Modifies the JSON file (e.g. selection cut activation).
+ * `MakeScriptAli` Generates the AliPhysics step script `script_ali.sh`.
+ * `MakeScriptO2` Generates the O2 step script `script_o2.sh`.
+ * `MakeScriptPostprocess` Generates the postprocessing step script `script_postprocess.sh` (e.g. plotting).
+* The `Clean` function takes one argument: `$1=1` for cleaning before running, `$1=2` for cleaning after running.
+* The AliPhysics and O2 step scripts take two arguments: `$1="<input file>"`, `$2="<JSON file>"`.
+* The postprocessing step script takes two arguments: `$1="<AliPhysics output file>"`, `$2="<O2 output file>"`.
+
+Configuration that should be defined in the task configuration includes:
+
+* Deactivation of the validation steps (`DOCLEAN`, `DOCONVERT`, `DOALI`, `DOO2`, `DOPOSTPROCESS`)
+* Customisation of the commands loading the AliPhysics, O2Physics and postprocessing environments (`ENV_ALI`, `ENV_O2`, `ENV_POST`). By default the latest builds of AliPhysics, O2Physics and ROOT are used, respectively.
+* Any other parameters related to "what should run and how", e.g. `SAVETREES`, `MAKE_GRAPH`, `USEALIEVCUTS`
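+
+A skeleton of a task configuration could look like the following sketch. The generated step scripts here only echo their arguments; a real configuration would write the actual AliPhysics/O2/postprocessing commands instead (see the dummy examples listed at the end of this section).
+
+```bash
+# config_tasks.sh (sketch)
+
+# Steps
+DOCLEAN=1; DOCONVERT=1; DOALI=1; DOO2=1; DOPOSTPROCESS=1
+
+function Clean {
+  # $1 = 1: before running, $1 = 2: after running
+  [ "$1" -eq 2 ] && rm -f "$SCRIPT_ALI" "$SCRIPT_O2" "$SCRIPT_POSTPROCESS"
+  return 0
+}
+
+function AdjustJson {
+  # Modify the JSON file here (e.g. activate selection cuts).
+  return 0
+}
+
+function MakeScriptAli {
+  # The generated script is executed with $1=<input file>, $2=<JSON file>.
+  cat > "$SCRIPT_ALI" <<'EOF'
+#!/bin/bash
+echo "AliPhysics step would process $1 with configuration $2"
+EOF
+}
+
+function MakeScriptO2 {
+  cat > "$SCRIPT_O2" <<'EOF'
+#!/bin/bash
+echo "O2 step would process $1 with configuration $2"
+EOF
+}
+
+function MakeScriptPostprocess {
+  # The generated script is executed with $1=<AliPhysics output>, $2=<O2 output>.
+  cat > "$SCRIPT_POSTPROCESS" <<'EOF'
+#!/bin/bash
+echo "Postprocessing would compare $1 and $2"
+EOF
+}
+```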
+
+### Workflow specification
+
+The full O2 command, executed in the O2 step script to run the activated O2 workflows, is generated in the `MakeScriptO2` function using a dedicated Python script [`make_command_o2.py`](exec/make_command_o2.py).
+This script generates the command using a **YAML database (`workflows.yml`) that specifies workflow options and how workflows depend on each other**.
+You can consider a workflow specification in this database to be the equivalent of a wagon definition on AliHyperloop, including the definition of the wagon name, the workflow name, the dependencies and the derived data. The main difference is that the device configuration is stored in the JSON file.
+
+The workflow database has two sections: `options` and `workflows`.
+The `options` section defines `global` options, used once at the end of the command, and `local` options, used for every workflow.
+The `workflows` section contains the "wagon" definitions.
+The available parameters are:
+
+* `executable` Workflow command, if different from the "wagon" name
+ * This allows you to define multiple wagons for the same workflow.
+* `dependencies` **Direct** dependencies (i.e. other wagons **directly** needed to run this wagon)
+ * Allowed formats: string, list of strings
+ * Direct dependencies are wagons that produce tables consumed by this wagon. You can figure them out using the [`find_dependencies.py`](https://github.com/AliceO2Group/O2Physics/blob/master/Scripts/find_dependencies.py) script in O2Physics.
+* `requires_mc` Boolean parameter to specify whether the workflow can only run on MC
+* `options` Command line options. (Currently not supported on AliHyperloop.)
+ * Allowed formats: string, list of strings, dictionary with keys `default`, `real`, `mc`
+* `tables` Descriptions of output tables to be saved as trees
+ * Allowed formats: string, list of strings, dictionary with keys `default`, `real`, `mc`
+
+The `make_command_o2.py` script allows you to generate a topology graph to visualise the dependencies defined in the database, using [Graphviz](https://graphviz.org/).
+Generation of the topology graph can be conveniently enabled with `MAKE_GRAPH=1` in the task configuration.
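+
+For reference, this is how the dummy task configuration [`config/config_tasks_dummy.sh`](config/config_tasks_dummy.sh) composes the call to the generator inside `MakeScriptO2` (excerpt):
+
+```bash
+# Translate options into arguments of the generating script.
+OPT_MAKECMD=""
+[ "$INPUT_IS_MC" -eq 1 ] && OPT_MAKECMD+=" --mc"
+[ "$DEBUG" -eq 1 ] && OPT_MAKECMD+=" -d"
+
+# Generate the O2 command.
+MAKECMD="python3 $DIR_EXEC/make_command_o2.py $DATABASE_O2 $OPT_MAKECMD"
+```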
+
+Dummy examples of the configuration files can be found in:
+
+* [`config/config_input_dummy.sh`](config/config_input_dummy.sh),
+* [`config/config_tasks_dummy.sh`](config/config_tasks_dummy.sh),
+* [`config/workflows_dummy.yml`](config/workflows_dummy.yml).
## Preparation
@@ -136,7 +223,7 @@ sudo apt install parallel
Now you are ready to run the validation code.
-**Make sure that your bash environment is clean!
+**Make sure that your Bash environment is clean!
Do not load ROOT, AliPhysics, O2, O2Physics or any other aliBuild package environment before running the framework!**
Enter any directory and execute the steering script `runtest.sh`.
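+
+For example, a typical quick start for the heavy-flavour validation, assuming you have access to the input data configured in [`codeHF/config_input.sh`](codeHF/config_input.sh), could look like:
+
+```bash
+cd codeHF
+bash ../exec/runtest.sh
+```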
@@ -156,12 +243,25 @@ If any step fails, the script will display an error message and you should look
If the main log file of a validation step mentions "parallel: This job failed:", inspect the respective log file in the directory of the corresponding job.
+## How to add a new workflow
+
+To add a new workflow to the framework configuration, follow these steps (a sketch of the shell parts for a hypothetical workflow is shown after the list).
+
+* Add the workflow in the [task configuration](#task-configuration):
+ * Add the activation switch: `DOO2_...=0 # name of the workflow (without o2-analysis)`.
+ * Add the application of the switch in the `MakeScriptO2` function: `[ $DOO2_... -eq 1 ] && WORKFLOWS+=" o2-analysis-..."`.
+ * If needed, add lines in the `AdjustJson` function to modify the JSON configuration.
+* Add the [workflow specification](#workflow-specification) in the workflow database:
+ * See the dummy example `o2-analysis-workflow` for the full list of options.
+* Add the device configuration in the default JSON file.
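+
+The sketch below shows the shell parts for a hypothetical workflow `o2-analysis-my-task` (the switch and workflow names are illustrative; the corresponding entries in the workflow database and in the JSON file follow the dummy examples):
+
+```bash
+# In the task configuration:
+DOO2_MYTASK=0 # my-task
+
+# In the MakeScriptO2 function:
+[ $DOO2_MYTASK -eq 1 ] && WORKFLOWS+=" o2-analysis-my-task"
+```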
+
## Job debugging
If you run many parallelised jobs and some of them don't finish successfully, you can make use of the debugging script [`debug.sh`](exec/debug.sh) in the [`exec`](exec) directory
which can help you figure out what went wrong, where and why.
You can execute the script from the current working directory using the following syntax (options can be combined):
+
```bash
bash [<path>/]debug.sh [-h] [-t TYPE] [-b [-u]] [-f] [-w] [-e]
```
@@ -180,10 +280,16 @@ bash [/]debug.sh [-h] [-t TYPE] [-b [-u]] [-f] [-w] [-e]
`-e` Show errors (for all jobs).
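+
+For example, to see the available options and then list the errors found in the logs of all jobs (the path is illustrative):
+
+```bash
+bash ../exec/debug.sh -h   # print the usage specification
+bash ../exec/debug.sh -e   # show errors (for all jobs)
+```
+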
-## Heavy-flavour analyses
+## Specific analyses
+
+### Heavy-flavour analyses
Enter the [`codeHF`](codeHF) directory and see the [`README`](codeHF/README.md).
+### Jet analyses
+
+Enter the [`codeJE`](codeJE) directory.
+
## Keep your repositories and installations up to date and clean
With the ongoing fast development, it can easily happen that updating the O2Physics part of the validation
@@ -197,8 +303,9 @@ This includes updating alidist, AliPhysics, O2(Physics), and this Run
as well as re-building your AliPhysics and O2(Physics) installations via aliBuild and deleting obsolete builds.
You can execute the script from any directory on your system using the following syntax:
+
```bash
-python <path>/exec/update_packages.py [-h] [-d] [-l] [-c] database
+python [<path>/]exec/update_packages.py [-h] [-d] [-l] [-c] database
```
optional arguments:
@@ -245,7 +352,7 @@ It is possible to check your code locally (before even committing or pushing):
### Space checker
```bash
-bash <path>/exec/check_spaces.sh
+bash [<path>/]exec/check_spaces.sh
```
### [ClangFormat](https://clang.llvm.org/docs/ClangFormat.html)
@@ -254,7 +361,7 @@ bash /exec/check_spaces.sh
clang-format -style=file -i <file>
```
-### [MegaLinter](https://oxsecurity.github.io/megalinter/latest/mega-linter-runner/)
+### [MegaLinter](https://megalinter.io/latest/mega-linter-runner/)
```bash
npx mega-linter-runner
diff --git a/codeHF/README.md b/codeHF/README.md
index dc68e75c..47c58395 100644
--- a/codeHF/README.md
+++ b/codeHF/README.md
@@ -55,13 +55,3 @@ The postprocessing step produces several plots `comparison_histos_(...).pdf`, `M
To confirm that the output of the default settings looks as expected, compare the produced plots with their reference counterparts `(...)_ref.pdf`.
The complete list of commit hashes used to produce the reference plots can be found in `versions_ref.txt`.
-
-## Add a new workflow
-
-- Add the workflow in the task configuration ([`config_task.sh`](config_tasks.sh)):
- - Add the activation switch: `DOO2_...=0 # name of the workflow (without o2-analysis)`.
- - Add the application of the switch in the `MakeScriptO2` function: `[ $DOO2_... -eq 1 ] && WORKFLOWS+=" o2-analysis-..."`.
- - If needed, add lines in the `AdjustJson` function to modify the JSON configuration.
-- Add the workflow specification in the workflow database ([`workflows.yml`](workflows.yml)):
- - See the dummy example `o2-analysis-workflow` for the full list of options.
-- Add the device configuration in the default JSON file ([`dpl-config_run3.json`](dpl-config_run3.json)).
diff --git a/codeHF/config_input.sh b/codeHF/config_input.sh
index cba1df0c..262d5389 100644
--- a/codeHF/config_input.sh
+++ b/codeHF/config_input.sh
@@ -8,29 +8,30 @@ INPUT_CASE=2 # Input case
NFILESMAX=1 # Maximum number of processed input files. (Set to -0 to process all; to -N to process all but the last N files.)
-# Number of input files per job (Automatic optimisation on if < 1.)
+# Number of input files per job. (Will be automatically optimised if set to 0.)
NFILESPERJOB_CONVERT=0 # Conversion
NFILESPERJOB_ALI=0 # AliPhysics
NFILESPERJOB_O2=1 # O2
-# Maximum number of simultaneously running O2 jobs
+# Maximum number of simultaneously running O2 jobs. (Adjust it based on available memory.)
NJOBSPARALLEL_O2=$(python3 -c "print(min(10, round($(nproc) / 2)))")
-JSONRUN3="dpl-config_run3.json" # Run 3 tasks parameters
-# Run 5 tasks parameters for open HF study
-JSONRUN5_HF="dpl-config_run5_hf.json"
-# Run 5 tasks parameters for onia studies:
-# J/psi and X (higher pt cut on 2-prong decay tracks and no DCA cut on single track)
-JSONRUN5_ONIAX="dpl-config_run5_oniaX.json"
-JSON="$JSONRUN3"
-
# Default settings:
-# INPUT_FILES="AliESDs.root"
-# INPUT_SYS="pp"
-# INPUT_RUN=2
-# INPUT_IS_O2=0
-# INPUT_IS_MC=0
-# JSON="$JSONRUN3"
+# INPUT_LABEL="nothing" # Input description
+# INPUT_DIR="$PWD" # Input directory
+# INPUT_FILES="AliESDs.root" # Input file pattern
+# INPUT_SYS="pp" # Collision system ("pp", "PbPb")
+# INPUT_RUN=2 # LHC Run (2, 3, 5)
+# INPUT_IS_O2=0 # Input files are in O2 format.
+# INPUT_IS_MC=0 # Input files are MC data.
+# INPUT_PARENT_MASK="" # Path replacement mask for the input directory of parent files in case of linked derived O2 input. Set to ";" if no replacement needed.
+# JSON="dpl-config.json" # O2 device configuration
+
+# O2 device configuration
+JSONRUN3="dpl-config_run3.json" # Run 3
+# JSONRUN5_HF="dpl-config_run5_hf.json" # Run 5, open HF
+# JSONRUN5_ONIAX="dpl-config_run5_oniaX.json" # Run 5, onia (J/psi and X: higher pt cut on 2-prong decay tracks and no DCA cut on single track)
+JSON="$JSONRUN3"
INPUT_BASE="/data2/data" # alicecerno2
diff --git a/codeHF/config_tasks.sh b/codeHF/config_tasks.sh
index 0a607142..b6de8d63 100644
--- a/codeHF/config_tasks.sh
+++ b/codeHF/config_tasks.sh
@@ -13,7 +13,7 @@
####################################################################################################
-# Here you can select the AliPhysics and O2Physics branches to load.
+# Here you can select the AliPhysics and O2Physics Git branches to load. (You need to have them built with aliBuild.)
# BRANCH_ALI="master"
# ENV_ALI="alienv setenv AliPhysics/latest-${BRANCH_ALI}-o2 -c"
# BRANCH_O2="master"
@@ -29,9 +29,8 @@ DOPOSTPROCESS=1 # Run output postprocessing. (Comparison plots. Requires DOA
# Disable incompatible steps.
[ "$INPUT_IS_O2" -eq 1 ] && { DOCONVERT=0; DOALI=0; }
-# O2 database
-DATABASE_O2="workflows.yml"
-MAKE_GRAPH=0 # Make topology graph.
+DATABASE_O2="workflows.yml" # Workflow specification database
+MAKE_GRAPH=0 # Make topology graph.
# Activation of O2 workflows
# Trigger selection
diff --git a/codeHF/dpl-config_run3.json b/codeHF/dpl-config_run3.json
index 08222072..c1b13c7f 100644
--- a/codeHF/dpl-config_run3.json
+++ b/codeHF/dpl-config_run3.json
@@ -6,6 +6,7 @@
"start-value-enumeration": "0",
"end-value-enumeration": "-1",
"step-value-enumeration": "1",
+ "aod-file-private": "@list_o2.txt",
"aod-file": "@list_o2.txt",
"aod-parent-base-path-replacement": "PARENT_PATH_MASK",
"aod-parent-access-level": 1
diff --git a/codeHF/workflows.yml b/codeHF/workflows.yml
index 665bdfe7..6e8ee012 100644
--- a/codeHF/workflows.yml
+++ b/codeHF/workflows.yml
@@ -20,7 +20,7 @@ workflows:
# default: ""
# real: ""
# mc: "--doMC"
- tables: [] # descriptions of tables to be saved in the output tree (format: str, list), see more detailed format below
+ tables: [] # descriptions of output tables to be saved as trees (format: str, list), see more detailed format below
# tables:
# default: []
# real: []
diff --git a/codeJE/config_input.sh b/codeJE/config_input.sh
index 9a399e08..4d3a86b9 100644
--- a/codeJE/config_input.sh
+++ b/codeJE/config_input.sh
@@ -4,28 +4,28 @@
# Input specification for runtest.sh
# (Modifies input parameters.)
-INPUT_CASE=13 # Input case
+INPUT_CASE=2 # Input case
NFILESMAX=1 # Maximum number of processed input files. (Set to -0 to process all; to -N to process all but the last N files.)
-# Number of input files per job (Automatic optimisation on if < 1.)
+# Number of input files per job. (Will be automatically optimised if set to 0.)
NFILESPERJOB_CONVERT=0 # Conversion
NFILESPERJOB_ALI=0 # AliPhysics
NFILESPERJOB_O2=1 # O2
-# Maximum number of simultaneously running O2 jobs
+# Maximum number of simultaneously running O2 jobs. (Adjust it based on available memory.)
NJOBSPARALLEL_O2=$(python3 -c "print(min(10, round($(nproc) / 2)))")
-JSONRUN3="dpl-config.json" # Run 3 tasks parameters
-JSON="$JSONRUN3"
-
# Default settings:
-# INPUT_FILES="AliESDs.root"
-# INPUT_SYS="pp"
-# INPUT_RUN=2
-# INPUT_IS_O2=0
-# INPUT_IS_MC=0
-# JSON="$JSONRUN3"
+# INPUT_LABEL="nothing" # Input description
+# INPUT_DIR="$PWD" # Input directory
+# INPUT_FILES="AliESDs.root" # Input file pattern
+# INPUT_SYS="pp" # Collision system ("pp", "PbPb")
+# INPUT_RUN=2 # LHC Run (2, 3, 5)
+# INPUT_IS_O2=0 # Input files are in O2 format.
+# INPUT_IS_MC=0 # Input files are MC data.
+# INPUT_PARENT_MASK="" # Path replacement mask for the input directory of parent files in case of linked derived O2 input. Set to ";" if no replacement needed.
+# JSON="dpl-config.json" # O2 device configuration
INPUT_BASE="/data2/data" # alicecerno2
diff --git a/codeJE/config_tasks.sh b/codeJE/config_tasks.sh
index 28c11e5b..88727b6b 100644
--- a/codeJE/config_tasks.sh
+++ b/codeJE/config_tasks.sh
@@ -13,7 +13,7 @@
####################################################################################################
-# Here you can select the AliPhysics and O2Physics branches to load.
+# Here you can select the AliPhysics and O2Physics Git branches to load. (You need to have them built with aliBuild.)
# BRANCH_ALI="master"
# ENV_ALI="alienv setenv AliPhysics/latest-${BRANCH_ALI}-o2 -c"
# BRANCH_O2="master"
@@ -29,9 +29,8 @@ DOPOSTPROCESS=1 # Run output postprocessing. (Comparison plots. Requires DOA
# Disable incompatible steps.
[ "$INPUT_IS_O2" -eq 1 ] && { DOCONVERT=0; DOALI=0; }
-# O2 database
-DATABASE_O2="workflows.yml"
-MAKE_GRAPH=0 # Make topology graph.
+DATABASE_O2="workflows.yml" # Workflow specification database
+MAKE_GRAPH=0 # Make topology graph.
# Activation of O2 workflows
# Table producers
@@ -49,6 +48,7 @@ DOO2_CONV_COLL=0 # collision-converter
DOO2_CONV_ZDC=1 # zdc-converter
DOO2_CONV_BC=1 # bc-converter
DOO2_CONV_TRKEX=1 # tracks-extra-converter
+DOO2_CONV_V0=0 # v0converter
SAVETREES=0 # Save O2 tables to trees.
USEO2VERTEXER=1 # Use the O2 vertexer in AliPhysics.
@@ -156,6 +156,7 @@ function MakeScriptO2 {
[ $DOO2_CONV_ZDC -eq 1 ] && WORKFLOWS+=" o2-analysis-zdc-converter"
[ $DOO2_CONV_BC -eq 1 ] && WORKFLOWS+=" o2-analysis-bc-converter"
[ $DOO2_CONV_TRKEX -eq 1 ] && WORKFLOWS+=" o2-analysis-tracks-extra-converter"
+ [ $DOO2_CONV_V0 -eq 1 ] && WORKFLOWS+=" o2-analysis-v0converter"
# Translate options into arguments of the generating script.
OPT_MAKECMD=""
diff --git a/codeJE/dpl-config.json b/codeJE/dpl-config.json
index 05116b11..ef43a1d3 100644
--- a/codeJE/dpl-config.json
+++ b/codeJE/dpl-config.json
@@ -7,6 +7,7 @@
"start-value-enumeration": "0",
"end-value-enumeration": "-1",
"step-value-enumeration": "1",
+ "aod-file-private": "@list_o2.txt",
"aod-file": "@list_o2.txt",
"aod-parent-base-path-replacement": "PARENT_PATH_MASK",
"aod-parent-access-level": 1
diff --git a/codeJE/workflows.yml b/codeJE/workflows.yml
index 8a39c1f7..00d20f6b 100644
--- a/codeJE/workflows.yml
+++ b/codeJE/workflows.yml
@@ -20,7 +20,7 @@ workflows:
# default: ""
# real: ""
# mc: "--doMC"
- tables: [] # descriptions of tables to be saved in the output tree (format: str, list), see more detailed format below
+ tables: [] # descriptions of output tables to be saved as trees (format: str, list), see more detailed format below
# tables:
# default: []
# real: []
@@ -94,4 +94,6 @@ workflows:
o2-analysis-tracks-extra-converter: {}
+ o2-analysis-v0converter: {}
+
o2-analysis-calo-label-converter: {}
diff --git a/config/config_input_dummy.sh b/config/config_input_dummy.sh
index 4689e964..dbb189da 100644
--- a/config/config_input_dummy.sh
+++ b/config/config_input_dummy.sh
@@ -8,21 +8,27 @@ INPUT_CASE=1 # Input case
NFILESMAX=1 # Maximum number of processed input files. (Set to -0 to process all; to -N to process all but the last N files.)
-# Number of input files per job (Automatic optimisation on if < 1.)
+# Number of input files per job. (Will be automatically optimised if set to 0.)
NFILESPERJOB_CONVERT=0 # Conversion
NFILESPERJOB_ALI=0 # AliPhysics
NFILESPERJOB_O2=1 # O2
-# Maximum number of simultaneously running O2 jobs
+# Maximum number of simultaneously running O2 jobs. (Adjust it based on available memory.)
NJOBSPARALLEL_O2=$(python3 -c "print(min(10, round($(nproc) / 2)))")
# Default settings:
-# INPUT_FILES="AliESDs.root"
-# INPUT_SYS="pp"
-# INPUT_RUN=2
-# INPUT_IS_O2=0
-# INPUT_IS_MC=0
-# JSON="dpl-config.json"
+# INPUT_LABEL="nothing" # Input description
+# INPUT_DIR="$PWD" # Input directory
+# INPUT_FILES="AliESDs.root" # Input file pattern
+# INPUT_SYS="pp" # Collision system ("pp", "PbPb")
+# INPUT_RUN=2 # LHC Run (2, 3, 5)
+# INPUT_IS_O2=0 # Input files are in O2 format.
+# INPUT_IS_MC=0 # Input files are MC data.
+# INPUT_PARENT_MASK="" # Path replacement mask for the input directory of parent files in case of linked derived O2 input. Set to ";" if no replacement needed.
+# JSON="dpl-config.json" # O2 device configuration
+
+# O2 device configuration
+JSON="dpl-config_dummy.json"
INPUT_BASE="/data"
diff --git a/config/config_tasks_dummy.sh b/config/config_tasks_dummy.sh
index b81f06f0..ead1e4ba 100644
--- a/config/config_tasks_dummy.sh
+++ b/config/config_tasks_dummy.sh
@@ -13,24 +13,37 @@
####################################################################################################
+# Here you can select the AliPhysics and O2Physics Git branches to load. (You need to have them built with aliBuild.)
+# BRANCH_ALI="master"
+# ENV_ALI="alienv setenv AliPhysics/latest-${BRANCH_ALI}-o2 -c"
+# BRANCH_O2="master"
+# ENV_O2="alienv setenv O2Physics/latest-${BRANCH_O2}-o2 -c"
+
# Steps
DOCLEAN=1 # Delete created files (before and after running tasks).
DOCONVERT=1 # Convert AliESDs.root to AO2D.root.
DOALI=1 # Run AliPhysics tasks.
DOO2=1 # Run O2 tasks.
-DOPOSTPROCESS=1 # Run output postprocessing. (Compare AliPhysics and O2 output.)
+DOPOSTPROCESS=1 # Run output postprocessing. (Comparison plots. Requires DOALI=1 and/or DOO2=1)
# Disable incompatible steps.
[ "$INPUT_IS_O2" -eq 1 ] && { DOCONVERT=0; DOALI=0; }
-# O2 database
-DATABASE_O2="workflows_dummy.yml"
-MAKE_GRAPH=0 # Make topology graph.
+DATABASE_O2="workflows_dummy.yml" # Workflow specification database
+MAKE_GRAPH=0 # Make topology graph.
# Activation of O2 workflows
# Trigger selection
DOO2_EVTSEL=1 # event-selection
DOO2_TRACKSEL=1 # trackselection
+# Converters
+DOO2_CONV_MC=0 # mc-converter
+DOO2_CONV_FDD=0 # fdd-converter
+DOO2_CONV_COLL=0 # collision-converter
+DOO2_CONV_ZDC=1 # zdc-converter
+DOO2_CONV_BC=1 # bc-converter
+DOO2_CONV_TRKEX=1 # tracks-extra-converter
+DOO2_CONV_V0=0 # v0converter
SAVETREES=0 # Save O2 tables to trees.
@@ -45,7 +58,7 @@ function Clean {
[ "$1" -eq 2 ] && {
rm -f "$LISTFILES_ALI" "$LISTFILES_O2" "$SCRIPT_ALI" "$SCRIPT_O2" "$SCRIPT_POSTPROCESS" || ErrExit "Failed to rm created files."
[ "$JSON_EDIT" ] && { rm "$JSON_EDIT" || ErrExit "Failed to rm $JSON_EDIT."; }
- rm "$DATABASE_O2_EDIT" || ErrExit "Failed to rm $DATABASE_O2_EDIT."
+ [ "$DATABASE_O2_EDIT" ] && { rm "$DATABASE_O2_EDIT" || ErrExit "Failed to rm $DATABASE_O2_EDIT."; }
}
return 0
@@ -59,6 +72,11 @@ function AdjustJson {
cp "$JSON" "$JSON_EDIT" || ErrExit "Failed to cp $JSON $JSON_EDIT."
JSON="$JSON_EDIT"
+ # Derived AO2D input
+ if [ "$INPUT_PARENT_MASK" ]; then
+ ReplaceString "PARENT_PATH_MASK" "$INPUT_PARENT_MASK" "$JSON" || ErrExit "Failed to edit $JSON."
+ fi
+
# Collision system
MsgWarn "Setting collision system $INPUT_SYS"
@@ -93,11 +111,23 @@ function MakeScriptO2 {
SUFFIX_RUN_MASK="_runX" # suffix mask to be replaced in the workflow names
SUFFIX_RUN="_run${INPUT_RUN}" # the actual suffix to be used instead of the mask
+ # Suffix to distinguish the workflows that run on derived data with parent access (skims)
+ SUFFIX_DER_MASK="_derX" # suffix mask to be replaced in the workflow names
+ [ "$INPUT_PARENT_MASK" ] && SUFFIX_DER="_derived" || SUFFIX_DER="" # the actual suffix to be used instead of the mask
+
WORKFLOWS=""
[ $DOO2_EVTSEL -eq 1 ] && WORKFLOWS+=" o2-analysis-event-selection"
[ $DOO2_TRACKSEL -eq 1 ] && WORKFLOWS+=" o2-analysis-trackselection${SUFFIX_RUN}"
-
- # Translate options into arguments of the generating script.
+ # Converters
+ [ $DOO2_CONV_MC -eq 1 ] && WORKFLOWS+=" o2-analysis-mc-converter"
+ [ $DOO2_CONV_FDD -eq 1 ] && WORKFLOWS+=" o2-analysis-fdd-converter"
+ [ $DOO2_CONV_COLL -eq 1 ] && WORKFLOWS+=" o2-analysis-collision-converter"
+ [ $DOO2_CONV_ZDC -eq 1 ] && WORKFLOWS+=" o2-analysis-zdc-converter"
+ [ $DOO2_CONV_BC -eq 1 ] && WORKFLOWS+=" o2-analysis-bc-converter"
+ [ $DOO2_CONV_TRKEX -eq 1 ] && WORKFLOWS+=" o2-analysis-tracks-extra-converter"
+ [ $DOO2_CONV_V0 -eq 1 ] && WORKFLOWS+=" o2-analysis-v0converter"
+
+ # Translate options into arguments of the generating script.
OPT_MAKECMD=""
[ "$INPUT_IS_MC" -eq 1 ] && OPT_MAKECMD+=" --mc"
[ "$DEBUG" -eq 1 ] && OPT_MAKECMD+=" -d"
@@ -111,6 +141,7 @@ function MakeScriptO2 {
# Replace the workflow version masks with the actual values in the workflow database.
ReplaceString "$SUFFIX_RUN_MASK" "$SUFFIX_RUN" "$DATABASE_O2" || ErrExit "Failed to edit $DATABASE_O2."
+ ReplaceString "$SUFFIX_DER_MASK" "$SUFFIX_DER" "$DATABASE_O2" || ErrExit "Failed to edit $DATABASE_O2."
# Generate the O2 command.
MAKECMD="python3 $DIR_EXEC/make_command_o2.py $DATABASE_O2 $OPT_MAKECMD"
diff --git a/config/dpl-config_dummy.json b/config/dpl-config_dummy.json
new file mode 100644
index 00000000..43d97550
--- /dev/null
+++ b/config/dpl-config_dummy.json
@@ -0,0 +1,14 @@
+{
+ "internal-dpl-aod-reader": {
+ "time-limit": "0",
+ "orbit-offset-enumeration": "0",
+ "orbit-multiplier-enumeration": "0",
+ "start-value-enumeration": "0",
+ "end-value-enumeration": "-1",
+ "step-value-enumeration": "1",
+ "aod-file": "@list_o2.txt",
+ "aod-file-private": "@list_o2.txt",
+ "aod-parent-base-path-replacement": "PARENT_PATH_MASK",
+ "aod-parent-access-level": 1
+ }
+}
diff --git a/config/workflows_dummy.yml b/config/workflows_dummy.yml
index 22efeb58..19519804 100644
--- a/config/workflows_dummy.yml
+++ b/config/workflows_dummy.yml
@@ -6,6 +6,7 @@ options:
- "--configuration json://$JSON"
- "--aod-memory-rate-limit 2000000000"
- "--shm-segment-size 16000000000"
+ - "--resources-monitoring 2"
- "--min-failure-level error"
workflows:
@@ -19,7 +20,7 @@ workflows:
# default: ""
# real: ""
# mc: "--doMC"
- tables: [] # descriptions of tables to be saved in the output tree (format: str, list), see more detailed format below
+ tables: [] # descriptions of output tables to be saved as trees (format: str, list), see more detailed format below
# tables:
# default: []
# real: []
@@ -27,6 +28,9 @@ workflows:
# Helper tasks
+ o2-analysis-track-to-collision-associator:
+ tables: HFTRACKASSOC
+
o2-analysis-timestamp: {}
o2-analysis-trackselection_run2:
@@ -46,10 +50,17 @@ workflows:
o2-analysis-track-dca_run3:
executable: o2-analysis-track-propagation
+ dependencies: o2-analysis-timestamp
o2-analysis-track-dca_run5:
executable: o2-analysis-alice3-trackextension
+ o2-analysis-centrality_run2:
+ executable: o2-analysis-centrality-table
+
+ o2-analysis-centrality_run3:
+ executable: o2-analysis-centrality-table
+
o2-analysis-centrality_run5:
executable: o2-analysis-alice3-centrality
dependencies: o2-analysis-track-dca_run5
@@ -64,21 +75,30 @@ workflows:
executable: o2-analysis-multiplicity-table
dependencies: o2-analysis-event-selection
- o2-analysis-centrality-table: {}
+ o2-analysis-ft0-corrected-table: {}
+
+ # PID
o2-analysis-pid-tpc-base: {}
o2-analysis-pid-tpc-full:
dependencies: [o2-analysis-pid-tpc-base, o2-analysis-timestamp]
- o2-analysis-pid-tof-base: {}
+ o2-analysis-pid-tof-base_run2:
+ executable: o2-analysis-pid-tof-base
+ dependencies: o2-analysis-event-selection
+
+ o2-analysis-pid-tof-base_run3:
+ executable: o2-analysis-pid-tof-base
+ dependencies: [o2-analysis-event-selection, o2-analysis-ft0-corrected-table]
- o2-analysis-pid-tof-full_run2: &tof_full
+ o2-analysis-pid-tof-full_run2:
executable: o2-analysis-pid-tof-full
- dependencies: [o2-analysis-pid-tof-base, o2-analysis-timestamp]
+ dependencies: [o2-analysis-pid-tof-base_run2, o2-analysis-timestamp]
o2-analysis-pid-tof-full_run3:
- <<: *tof_full
+ executable: o2-analysis-pid-tof-full
+ dependencies: [o2-analysis-pid-tof-base_run3, o2-analysis-timestamp]
o2-analysis-pid-tof-full_run5:
executable: o2-analysis-alice3-pid-tof
@@ -88,12 +108,22 @@ workflows:
o2-analysis-pid-tof-beta: {}
+ # Converters
+
o2-analysis-mc-converter: {}
o2-analysis-fdd-converter: {}
o2-analysis-collision-converter: {}
+ o2-analysis-zdc-converter: {}
+
+ o2-analysis-bc-converter: {}
+
+ o2-analysis-tracks-extra-converter: {}
+
+ o2-analysis-v0converter: {}
+
# LF
o2-analysis-lf-lambdakzerobuilder:
diff --git a/exec/runtest.sh b/exec/runtest.sh
index 756c961c..335859ec 100644
--- a/exec/runtest.sh
+++ b/exec/runtest.sh
@@ -4,7 +4,15 @@
# Steering script to run Run 2 to Run 3 conversion, AliPhysics tasks, O2 tasks, and postprocessing
####################################################################################################
-# Default settings
+# Declarations of global parameters and their default values
+
+# Main directories
+DIR_EXEC="$(dirname "$(realpath "$0")")" # Directory of this script (and other execution code)
+DIR_TASKS="$PWD" # Directory with task configuration
+
+# Configuration scripts
+CONFIG_INPUT="config_input.sh" # Input specification (Modifies input parameters.)
+CONFIG_TASKS="config_tasks.sh" # Task configuration (Cleans directory, modifies step activation, modifies JSON and generates step scripts via functions Clean, AdjustJson, MakeScriptAli, MakeScriptO2, MakeScriptPostprocess.)
# Steps
DOCLEAN=1 # Delete created files (before and after running tasks).
@@ -13,52 +21,47 @@ DOALI=1 # Run AliPhysics tasks.
DOO2=1 # Run O2 tasks.
DOPOSTPROCESS=1 # Run output postprocessing. (Compare AliPhysics and O2 output.)
-# Configuration scripts
-CONFIG_INPUT="config_input.sh" # Input specification (Modifies input parameters.)
-CONFIG_TASKS="config_tasks.sh" # Tasks configuration (Cleans directory, modifies step activation, modifies JSON and generates step scripts via functions Clean, AdjustJson, MakeScriptAli, MakeScriptO2, MakeScriptPostprocess.)
-
# Input parameters
INPUT_CASE=-1 # Input case
INPUT_LABEL="nothing" # Input description
INPUT_DIR="$PWD" # Input directory
-INPUT_PARENT_MASK="" # Path replacement mask for the input directory of parent files in case of derived input AO2D.root. Set to ";" if no replacement needed.
INPUT_FILES="AliESDs.root" # Input file pattern
-INPUT_SYS="pp" # Collision system
+INPUT_SYS="pp" # Collision system ("pp", "PbPb")
INPUT_RUN=2 # LHC Run (2, 3, 5)
-JSON="dpl-config.json" # Tasks parameters
INPUT_IS_O2=0 # Input files are in O2 format.
INPUT_IS_MC=0 # Input files are MC data.
+INPUT_PARENT_MASK="" # Path replacement mask for the input directory of parent files in case of linked derived O2 input. Set to ";" if no replacement needed.
+JSON="dpl-config.json" # O2 device configuration
+
+# Processing
NFILESMAX=1 # Maximum number of processed input files. (Set to -0 to process all; to -N to process all but the last N files.)
NFILESPERJOB_CONVERT=1 # Number of input files per conversion job
NFILESPERJOB_ALI=1 # Number of input files per AliPhysics job
NFILESPERJOB_O2=1 # Number of input files per O2 job
-# Other options
-SAVETREES=0 # Save O2 tables to trees.
-DEBUG=0 # Print out more information.
-USEALIEVCUTS=0 # Use AliEventCuts in AliPhysics (as used by conversion task)
-
# Performance
NCORES=$(nproc) # Ideal number of used cores
NCORESPERJOB_ALI=1 # Average number of cores used by one AliPhysics job
NCORESPERJOB_O2=1.6 # Average number of cores used by one O2 job
NJOBSPARALLEL_O2=$(nproc) # Maximum number of simultaneously running O2 jobs
-# This directory
-DIR_EXEC="$(dirname "$(realpath "$0")")"
+# Other options
+SAVETREES=0 # Save O2 tables to trees.
+DEBUG=0 # Print out more information.
+USEALIEVCUTS=0 # Use AliEventCuts in AliPhysics (as used by conversion task)
# Lists of input files
LISTFILES_ALI="list_ali.txt" # conversion and AliPhysics input
LISTFILES_O2="list_o2.txt" # O2 input
-# Output files names
+# Output files
FILEOUT="AnalysisResults.root"
FILEOUT_ALI="AnalysisResults_ALI.root"
FILEOUT_O2="AnalysisResults_O2.root"
FILEOUT_TREES="AnalysisResults_trees.root"
FILEOUT_TREES_O2="AnalysisResults_trees_O2.root"
-# Steering commands
+# Steering commands (loading aliBuild environments)
ENV_ALI="alienv setenv AliPhysics/latest -c"
ENV_O2="alienv setenv O2Physics/latest -c"
ENV_POST="alienv setenv ROOT/latest -c"
@@ -68,6 +71,9 @@ SCRIPT_O2="script_o2.sh"
SCRIPT_ALI="script_ali.sh"
SCRIPT_POSTPROCESS="script_postprocess.sh"
+# End of declarations of global parameters and their default values
+####################################################################################################
+
# Load utilities.
source "$DIR_EXEC/utilities.sh" || { echo "Error: Failed to load utilities."; exit 1; }