
Config restructure #621

Draft
wants to merge 12 commits into base: dev
48 changes: 7 additions & 41 deletions .github/workflows/ci.yml
@@ -55,18 +55,14 @@ jobs:
strategy:
matrix:
# Run remaining test profiles with minimum nextflow version
profile:
[
test_host_rm,
test_hybrid,
test_hybrid_host_rm,
test_busco_auto,
test_ancient_dna,
test_adapterremoval,
test_binrefinement,
test_virus_identification,
profile: [
test_nothing, ## fast config to kill all other jobs if something is fundamentally wrong
test_single_end,
test_concoct,
test_alternatives,
test_preassembly_binrefine, ## TODO CHECK OUTPUT
test_hybrid_host_rm,
#test_extras,
#test_bigdb,
]
steps:
- name: Free some space
@@ -85,33 +81,3 @@ jobs:
- name: Run pipeline with ${{ matrix.profile }} test profile
run: |
nextflow run ${GITHUB_WORKSPACE} -profile ${{ matrix.profile }},docker --outdir ./results

checkm:
name: Run single test to checkm due to database download
# Only run on push if this is the nf-core dev branch (merged PRs)
if: ${{ github.event_name != 'push' || (github.event_name == 'push' && github.repository == 'nf-core/mag') }}
runs-on: ubuntu-latest

steps:
- name: Free some space
run: |
sudo rm -rf "/usr/local/share/boost"
sudo rm -rf "$AGENT_TOOLSDIRECTORY"

- name: Check out pipeline code
uses: actions/checkout@v2

- name: Install Nextflow
run: |
wget -qO- get.nextflow.io | bash
sudo mv nextflow /usr/local/bin/

- name: Download and prepare CheckM database
run: |
mkdir -p databases/checkm
wget https://data.ace.uq.edu.au/public/CheckM_databases/checkm_data_2015_01_16.tar.gz -P databases/checkm
tar xzvf databases/checkm/checkm_data_2015_01_16.tar.gz -C databases/checkm/

- name: Run pipeline with ${{ matrix.profile }} test profile
run: |
nextflow run ${GITHUB_WORKSPACE} -profile test,docker --outdir ./results --binqc_tool checkm --checkm_db databases/checkm
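
For anyone who still wants the CheckM check that this dedicated job used to provide, the deleted steps can be reproduced locally. A rough sketch assembled from the removed job, using the same database archive and flags, with "nf-core/mag" substituted for ${GITHUB_WORKSPACE}; this is illustration only, not part of this PR:

#!/usr/bin/env bash
# Local stand-in for the removed checkm CI job (sketch only).
set -euo pipefail

# Download and unpack the CheckM database, as the deleted job did
mkdir -p databases/checkm
wget https://data.ace.uq.edu.au/public/CheckM_databases/checkm_data_2015_01_16.tar.gz -P databases/checkm
tar xzvf databases/checkm/checkm_data_2015_01_16.tar.gz -C databases/checkm/

# Run the standard test profile with CheckM as the bin QC tool
nextflow run nf-core/mag -profile test,docker --outdir ./results --binqc_tool checkm --checkm_db databases/checkm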
36 changes: 0 additions & 36 deletions conf/test_adapterremoval.config

This file was deleted.

31 changes: 15 additions & 16 deletions conf/test_bbnorm.config → conf/test_alternatives.config
@@ -1,41 +1,40 @@
/*
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Nextflow config file for running minimal tests
Nextflow config file for running minimal tests of alternative tools from default
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Defines input files and everything required to run a fast and simple pipeline test.
Defines input files and everything required to run a fast and simple pipeline test
of all alternative tools from a default test run.

Use as follows:
nextflow run nf-core/mag -profile test,<docker/singularity> --outdir <OUTDIR>
nextflow run nf-core/mag -profile test_alternatives,<docker/singularity> --outdir <OUTDIR>

----------------------------------------------------------------------------------------
*/

params {
config_profile_name = 'Test profile'
config_profile_description = 'Minimal test dataset to check pipeline function'
config_profile_name = 'Test alternatives profile'
config_profile_description = 'Minimal test dataset with alternative tools to check pipeline function'

// Limit resources so that this can run on GitHub Actions
max_cpus = 2
max_memory = '6.GB'
max_time = '6.h'

// Input data
input = params.pipelines_testdata_base_path + 'mag/samplesheets/samplesheet.csv'
keep_phix = true
skip_clipping = true
skip_prokka = true
skip_prodigal = true
skip_quast = true
skip_binning = true
// Input data -> Defaults
input = params.pipelines_testdata_base_path + 'mag/samplesheets/samplesheet.multirun.csv'
centrifuge_db = params.pipelines_testdata_base_path + 'mag/test_data/minigut_cf.tar.gz'
kraken2_db = params.pipelines_testdata_base_path + 'mag/test_data/minigut_kraken.tgz'
skip_krona = true
skip_krona = false
min_length_unbinned_contigs = 1
max_unbinned_contigs = 2
busco_db = "https://busco-data.ezlab.org/v5/data/lineages/bacteria_odb10.2024-01-08.tar.gz"
busco_clean = true
skip_gtdbtk = true
gtdbtk_min_completeness = 0
bbnorm = true
coassemble_group = true
skip_concoct = true

// Alternate tools from default test
clip_tool = 'adapterremoval'
binqc_tool = 'checkm'
bin_domain_classification = true // i.e., run tiara
}
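
For a quick local smoke test of the consolidated profile, the invocation follows the usage comment in the config itself; Docker is picked here purely for illustration, Singularity works the same way:

# Exercise the renamed test_alternatives profile locally
nextflow run nf-core/mag -profile test_alternatives,docker --outdir ./results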
42 changes: 0 additions & 42 deletions conf/test_ancient_dna.config

This file was deleted.

Empty file added conf/test_bigdb.config
Empty file.
43 changes: 0 additions & 43 deletions conf/test_concoct.config

This file was deleted.

Empty file added conf/test_extras.config
Empty file.
31 changes: 0 additions & 31 deletions conf/test_host_rm.config

This file was deleted.

30 changes: 0 additions & 30 deletions conf/test_hybrid.config

This file was deleted.

@@ -22,17 +22,22 @@ params {
// Input data
input = params.pipelines_testdata_base_path + 'mag/samplesheets/samplesheet.csv'
assembly_input = params.pipelines_testdata_base_path + 'mag/samplesheets/assembly_samplesheet.csv'
centrifuge_db = params.pipelines_testdata_base_path + 'mag/test_data/minigut_cf.tar.gz'
kraken2_db = params.pipelines_testdata_base_path + 'mag/test_data/minigut_kraken.tgz'
skip_krona = true
min_length_unbinned_contigs = 1
max_unbinned_contigs = 2
skip_metaeuk = false
metaeuk_db = 'https://github.com/nf-core/test-datasets/raw/modules/data/proteomics/database/yeast_UPS.fasta'
run_virus_identification = true
genomad_splits = 4
busco_db = "https://busco-data.ezlab.org/v5/data/lineages/bacteria_odb10.2024-01-08.tar.gz"
skip_gtdbtk = true
gtdbtk_min_completeness = 0
refine_bins_dastool = true
refine_bins_dastool_threshold = 0
// TODO not using 'both' until #489 merged
postbinning_input = 'refined_bins_only'
busco_clean = true
run_gunc = true

// For runtime reasons
skip_prokka = true // CONCOCT makes hundreds of bins and Prokka is slow (run takes ~28m with it on, ~22m without)
}
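
Since Nextflow exposes every params entry as a command-line flag, the refinement settings introduced above can also be requested ad hoc without a custom profile. A sketch, assuming the flag names mirror the config keys shown in this diff:

# Same bin-refinement behaviour via CLI flags (parameter names taken from the config above)
nextflow run nf-core/mag -profile test,docker --outdir ./results \
    --refine_bins_dastool \
    --refine_bins_dastool_threshold 0 \
    --postbinning_input refined_bins_only \
    --run_gunc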
1 change: 0 additions & 1 deletion conf/test_single_end.config
@@ -29,7 +29,6 @@ params {
binning_map_mode = 'own'
min_length_unbinned_contigs = 1000000
max_unbinned_contigs = 2
skip_gtdbtk = true
skip_concoct = true
skip_binqc = true
skip_gtdbtk = true
43 changes: 0 additions & 43 deletions conf/test_virus_identification.config

This file was deleted.

2 changes: 1 addition & 1 deletion modules.json
@@ -219,7 +219,7 @@
},
"prokka": {
"branch": "master",
"git_sha": "911696ea0b62df80e900ef244d7867d177971f73",
"git_sha": "49ebda931c36c2b282f7958d00e1236b751f1031",
"installed_by": ["modules"]
},
"pydamage/analyze": {
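
The pinned prokka revision was presumably bumped with the nf-core tools CLI rather than edited by hand; the usual workflow looks roughly like this (exact behaviour depends on the installed nf-core/tools version):

# Update the prokka module to the latest revision on nf-core/modules;
# this rewrites modules/nf-core/prokka/ and records the new git_sha in modules.json.
nf-core modules update prokka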
7 changes: 7 additions & 0 deletions modules/nf-core/prokka/environment.yml

Some generated files are not rendered by default.