Add implicit workflow definition to main #619
Ah, now I see what you mean 👍🏽 It does make sense to define things this way, but I wonder how often we will be including whole pipelines in other whole pipelines? We have to bear in mind that all of the modules/sub-workflows/utility scripts would also have to be copied across into the new pipeline context, and with the way things currently work I think that would be far more work than just running the original pipeline in isolation?
The main scenario that I can think of where this feature will be useful is benchmarking. A simple example could be to implement a pipeline that compares results between two versions of an nf-core DSL2 pipeline. If we push it further, it should also be possible to benchmark a …

```groovy
params {
    // Input options
    input                = ''
    pipeline             = ''
    path_to_pipelines    = "${projectDir}/modules/pipelines"
    pipeline_path        = "${params.path_to_pipelines}/${params.pipeline}" // remove and directly
    pipeline_test_config = "${params.pipeline_path}/conf/test.config"
    pipeline_config      = "${projectDir}/modules/pipelines/${params.pipeline}/nextflow.config"
    // Benchmark related params
    benchmarker_path     = "${projectDir}/modules/benchmarkers"
    skip_benchmark       = false
    ...
}
```
```groovy
// This is in fact a separate config file, but for the sake of simplicity I put it here
try {
    includeConfig "${params.pipeline_config}"
} catch (Exception e) {
    System.err.println("====================================================\n" +
                       "WARN: The included module pipeline `$params.pipeline`\n" +
                       "      does not declare any 'nextflow.config' file.\n" +
                       "      You can include it at `${params.path_to_pipelines}`\n" +
                       "      or otherwise use `--pipeline_config` to set its path.\n" +
                       "====================================================\n")
}
```

Then in the `main.nf`:

```groovy
pipeline_module = file( "${params.pipeline_path}/main.nf" )
if( !pipeline_module.exists() ) exit 1, "ERROR: The selected pipeline is not correctly included: ${params.pipeline}"
include { PIPELINE } from pipeline_module // pipeline name cannot be interpolated; the script might be created dynamically
```

Maybe there could be other scenarios where this approach could be used, for instance in the case of creating a … I cannot think of any collateral issue associated with the refactoring of the DSL2 …
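To make the benchmarking idea above concrete, here is a minimal sketch of what the wrapping pipeline could look like. The paths, the alias names `PIPELINE_V1`/`PIPELINE_V2`, and the assumption that both vendored pipeline copies expose a named workflow called `PIPELINE` are all illustrative, not from the original comment; the named workflow is exactly what this issue proposes adding.

```groovy
// Hypothetical benchmark main.nf (names and paths are placeholders).
nextflow.enable.dsl = 2

// Each vendored pipeline copy must define a *named* workflow (here
// assumed to be called PIPELINE) in its main.nf for these includes
// to work -- the implicit workflow alone cannot be included.
include { PIPELINE as PIPELINE_V1 } from './modules/pipelines/v1/main.nf'
include { PIPELINE as PIPELINE_V2 } from './modules/pipelines/v2/main.nf'

workflow {
    // Run both versions with the same inputs...
    PIPELINE_V1 ()
    PIPELINE_V2 ()
    // ...then compare/benchmark their outputs here.
}
```

Note that `include ... as` aliasing is what allows two copies of the "same" workflow name to coexist in the including script.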
Thanks @JoseEspinosa. As agreed on Slack, there is no harm in calling the workflow as you suggested if it gives us more flexible options in the future to call entire pipelines within others, as you nicely explained above.
Added a link to this issue in the main script for now in #621 |
To be able to include a pipeline as a subworkflow of another pipeline in DSL2, the former pipeline needs to have a named workflow definition alongside the implicit one. This requires refactoring the workflow definition in `main.nf` as shown below:
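The refactor the issue describes can be sketched as follows. The names `NFCORE_EXAMPLE` and `EXAMPLE` are placeholders (not from the original issue): before the change, all the logic runs inside the implicit (unnamed) workflow; after it, the logic lives in a named workflow that the implicit workflow simply delegates to, so other pipelines can `include` the named one.

```groovy
// Before: the pipeline logic sits directly in the implicit workflow,
// which cannot be included from another pipeline.
//
// workflow {
//     EXAMPLE ()   // EXAMPLE = the pipeline's main subworkflow (placeholder name)
// }

// After: a named entry workflow wraps the logic...
workflow NFCORE_EXAMPLE {
    EXAMPLE ()
}

// ...and the implicit workflow just calls it, so running the pipeline
// directly (`nextflow run .`) behaves exactly as before:
workflow {
    NFCORE_EXAMPLE ()
}
```

Because the implicit workflow now only forwards to the named one, the change is behavior-preserving for direct execution while making the pipeline includable.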
If I am not wrong, this change should not have any collateral impact on the pipeline.
Ping @ggabernet, @drpatelh and @ewels, since this issue started being discussed on the Slack #modules channel.