
Overhaul GHA #1200

Merged Sep 1, 2023 (29 commits).
The diff below shows changes from 17 of the 29 commits.
3957e7a
Tidy workflows into one testing workflow
adamrtalbot Aug 31, 2023
39e4f89
Tidy up AWS tests into single workflow
adamrtalbot Aug 31, 2023
5567e32
Condense linting workflows to single workflow
adamrtalbot Aug 31, 2023
6438b68
Add confirm-pass check to the end of the testing workflow
adamrtalbot Aug 31, 2023
83d81ea
[automated] Fix linting with Prettier
nf-core-bot Aug 31, 2023
a1436a9
Add Sentieon secret back in
adamrtalbot Aug 31, 2023
754684f
Fix tags for alignment checks
adamrtalbot Aug 31, 2023
77e800f
Add default test to list of existing tags
adamrtalbot Aug 31, 2023
5f341e2
Add merge group in case of a merge queue
adamrtalbot Aug 31, 2023
52bc779
Setup Cache for Nextflow and Pip
adamrtalbot Aug 31, 2023
204950a
Revert "Condense linting workflows to single workflow"
adamrtalbot Aug 31, 2023
eaac9e5
unique-ify the installation cache key
adamrtalbot Aug 31, 2023
c67d751
Use git commands instead of GHA CLI to check for path differences
adamrtalbot Aug 31, 2023
a64e540
Use explicit requirements.txt file
adamrtalbot Aug 31, 2023
1cced6d
Remove blank token from paths-filter
adamrtalbot Aug 31, 2023
2454384
lint requirements.txt
adamrtalbot Aug 31, 2023
a43de5a
Merge branch 'dev' of github.com:nf-core/sarek into overhaul_ci_gha
adamrtalbot Aug 31, 2023
827ef19
Update .github/workflows/pytest-workflow.yml
adamrtalbot Sep 1, 2023
15ec813
Match CI triggers to nf-core schema
adamrtalbot Sep 1, 2023
84dc775
CHANGELOG
adamrtalbot Sep 1, 2023
8f61be2
Rely on setup-python auto cacheing feature instead of manual cacheing
adamrtalbot Sep 1, 2023
439d837
fixup
adamrtalbot Sep 1, 2023
855c878
fixup
adamrtalbot Sep 1, 2023
d804599
Change order to install test data after software dependencies (fail e…
adamrtalbot Sep 1, 2023
919115f
Change triggers for AWS tests
adamrtalbot Sep 1, 2023
b40397c
Change triggers for AWS tests
adamrtalbot Sep 1, 2023
2866617
Change triggers for AWS tests
adamrtalbot Sep 1, 2023
aa9e189
Change triggers for AWS tests
adamrtalbot Sep 1, 2023
9342569
Change triggers for AWS tests
adamrtalbot Sep 1, 2023
37 changes: 0 additions & 37 deletions .github/workflows/awsfulltest.yml

This file was deleted.

34 changes: 0 additions & 34 deletions .github/workflows/awsfulltest_germline.yml

This file was deleted.

73 changes: 70 additions & 3 deletions .github/workflows/awstest.yml
@@ -4,10 +4,24 @@ name: nf-core AWS test

on:
  workflow_dispatch:
+    inputs:
[Review comment from a Contributor on the inputs block]: nice I like it, much cleaner to have it all here

+      profiletest:
+        description: "Trigger profile tests (smaller) on AWS"
+        type: boolean
+        default: true
+      somatic:
+        description: "Trigger somatic full test on AWS"
+        type: boolean
+        default: false
+      germline:
+        description: "Trigger germline full test on AWS"
+        type: boolean
+        default: false
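These workflow_dispatch inputs make each AWS test individually triggerable from the Actions UI. They could also be supplied from a local shell via the GitHub CLI; a hypothetical invocation (requires an authenticated `gh`, and is not part of this PR):

```shell
# Hypothetical manual dispatch of only the germline full test
gh workflow run awstest.yml \
  --repo nf-core/sarek \
  -f profiletest=false \
  -f somatic=false \
  -f germline=true
```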

jobs:
-  run-tower:
+  trigger-profile-test:
    name: Run AWS tests
-    if: github.repository == 'nf-core/sarek'
+    if: github.repository == 'nf-core/sarek' || inputs.profiletest == 'true'
    runs-on: ubuntu-latest
    steps:
      # Launch workflow using Tower CLI tool action
@@ -26,7 +40,60 @@ jobs:
          profiles: test
      - uses: actions/upload-artifact@v3
        with:
-          name: Tower debug log file
+          name: tower-profiletest-log
          path: |
            tower_action_*.log
            tower_action_*.json

+  trigger-full-test-somatic:
+    name: Run AWS full tests
+    if: ( github.repository == 'nf-core/sarek' && github.event_name == 'release' ) || inputs.somatic == 'true'
+    runs-on: ubuntu-latest
+    steps:
+      # Launch workflow using Tower CLI tool action
+      - name: Launch workflow via tower
+        uses: seqeralabs/action-tower-launch@v2
+        with:
+          workspace_id: ${{ secrets.TOWER_WORKSPACE_ID }}
+          access_token: ${{ secrets.TOWER_ACCESS_TOKEN }}
+          compute_env: ${{ secrets.TOWER_COMPUTE_ENV }}
+          revision: ${{ github.sha }}
+          workdir: s3://${{ secrets.AWS_S3_BUCKET }}/work/sarek/work-${{ github.sha }}/somatic_test
+          parameters: |
+            {
+              "hook_url": "${{ secrets.MEGATESTS_ALERTS_SLACK_HOOK_URL }}",
+              "outdir": "s3://${{ secrets.AWS_S3_BUCKET }}/sarek/results-${{ github.sha }}/somatic_test"
+            }
+          profiles: test_full
+
+      - uses: actions/upload-artifact@v3
+        with:
+          name: tower-full-somatic-log
+          path: |
+            tower_action_*.log
+            tower_action_*.json
+
+  trigger-full-test-germline:
+    name: Run AWS full tests
+    if: ( github.repository == 'nf-core/sarek' && github.event_name == 'release' ) || inputs.germline == 'true'
+    runs-on: ubuntu-latest
+    steps:
+      # Launch workflow using Tower CLI tool action
+      - name: Launch workflow via tower
+        uses: seqeralabs/action-tower-launch@v2
+        with:
+          workspace_id: ${{ secrets.TOWER_WORKSPACE_ID }}
+          access_token: ${{ secrets.TOWER_ACCESS_TOKEN }}
+          compute_env: ${{ secrets.TOWER_COMPUTE_ENV }}
+          revision: ${{ github.sha }}
+          workdir: s3://${{ secrets.AWS_S3_BUCKET }}/work/sarek/work-${{ github.sha }}/germline_test
+          parameters: |
+            {
+              "hook_url": "${{ secrets.MEGATESTS_ALERTS_SLACK_HOOK_URL }}",
+              "outdir": "s3://${{ secrets.AWS_S3_BUCKET }}/sarek/results-${{ github.sha }}/germline_test"
+            }
+          profiles: test_full_germline
+      - uses: actions/upload-artifact@v3
+        with:
+          name: tower-full-germline-log
+          path: tower_action_*.log
129 changes: 0 additions & 129 deletions .github/workflows/ci.yml

This file was deleted.

60 changes: 35 additions & 25 deletions .github/workflows/pytest-workflow.yml
@@ -1,8 +1,18 @@
-name: pytest-workflow
+name: test
# This workflow runs the pipeline with the minimal test dataset to check that it completes without any syntax errors
on:
  pull_request:
-    branches: [dev]
+    branches:
+      - master
+      - dev
+  release:
+    types:
+      - published
+  merge_group:
+    types:
+      - checks_requested
+    branches:
+      - master

# Cancel if a newer run is started
concurrency:
@@ -32,32 +42,13 @@ jobs:
      fail-fast: false
      matrix:
        tags: ["${{ fromJson(needs.changes.outputs.tags) }}"]
-        profile: ["docker"]
-        # profile: ["docker", "singularity", "conda"]
+        profile: ["docker", "singularity"]
        TEST_DATA_BASE:
          - "test-datasets/data"
        NXF_VER:
          - "23.04.0"
          - "latest-everything"
        exclude:
-          # - profile: "conda"
-          #   tags: concatenate_vcfs
-          # - profile: "conda"
-          #   tags: deepvariant
-          # - profile: "conda"
-          #   tags: haplotypecaller
-          # - profile: "conda"
-          #   tags: merge
-          # - profile: "conda"
-          #   tags: snpeff
-          # - profile: "conda"
-          #   tags: umi
-          # - profile: "conda"
-          #   tags: validation_checks
-          # - profile: "conda"
-          #   tags: vep
-          # - profile: "conda"
-          #   tags: sentieon/bwamem
+          - profile: "singularity"
+            tags: concatenate_vcfs
+          - profile: "singularity"
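The matrix/exclude mechanics above can be mimicked in a few lines of Python. This is only an illustration of GitHub Actions' expansion rules, not its implementation, and the tag values are a small subset of the pipeline's real tags:

```python
from itertools import product

# Simplified version of the workflow's strategy.matrix
matrix = {
    "profile": ["docker", "singularity"],
    "NXF_VER": ["23.04.0", "latest-everything"],
    "tags": ["concatenate_vcfs", "deepvariant"],
}
# A combination is dropped if it matches every key of an exclude entry
exclude = [{"profile": "singularity", "tags": "concatenate_vcfs"}]

keys = list(matrix)
all_jobs = [dict(zip(keys, combo)) for combo in product(*matrix.values())]
jobs = [
    job for job in all_jobs
    if not any(all(job[k] == v for k, v in exc.items()) for exc in exclude)
]
print(len(all_jobs), len(jobs))  # 8 expanded combinations, 6 after excludes
```

Each surviving dictionary becomes one CI job, which is why excludes are needed when a slow or broken profile should only skip specific tags rather than the whole matrix.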
@@ -107,14 +98,14 @@ jobs:
          python-version: "3.x"

      - uses: actions/cache@v3
        id: python-cache-setup
        with:
          path: ~/.cache/pip
          key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements.txt') }}
          restore-keys: |
            ${{ runner.os }}-pip-

      - name: Install Python dependencies
-        run: python -m pip install --upgrade pip pytest-workflow cryptography
+        if: steps.python-cache-setup.outputs.cache-hit != 'true'
+        run: python -m pip install --upgrade -r tests/requirements.txt

      - name: Install Nextflow ${{ matrix.NXF_VER }}
        uses: nf-core/setup-nextflow@v1
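A later commit in this series (8f61be2) drops the manual actions/cache step above in favour of setup-python's built-in pip caching. A minimal sketch of that pattern, assuming the requirements file path and not necessarily matching the PR's final YAML:

```yaml
# Sketch: setup-python derives the cache key from the dependency file,
# replacing the hand-rolled actions/cache step.
- uses: actions/setup-python@v4
  with:
    python-version: "3.x"
    cache: "pip"
    cache-dependency-path: "tests/requirements.txt"
```

This removes the need to maintain cache keys and cache-hit conditionals by hand.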
@@ -175,3 +166,22 @@ jobs:
            /home/runner/pytest_workflow_*/*/work
            !/home/runner/pytest_workflow_*/*/work/conda
            !/home/runner/pytest_workflow_*/*/work/singularity

+  confirm-pass:
+    runs-on: ubuntu-latest
+    needs:
+      - test
+    if: always()
+    steps:
+      - name: All tests ok
+        if: ${{ success() || !contains(needs.*.result, 'failure') }}
+        run: exit 0
+      - name: One or more tests failed
+        if: ${{ contains(needs.*.result, 'failure') }}
+        run: exit 1
+
+      - name: debug-print
+        if: always()
+        run: |
+          echo "toJSON(needs) = ${{ toJSON(needs) }}"
+          echo "toJSON(needs.*.result) = ${{ toJSON(needs.*.result) }}"
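The confirm-pass job gives branch protection a single required check regardless of how many matrix jobs actually run. The `needs.*.result` object filter flattens each upstream job's result into a list, and `contains(..., 'failure')` checks it; a rough Python analogue of the gate (an illustration only, not how GitHub evaluates expressions):

```python
# Mimics the confirm-pass gate: fail only if an upstream job failed;
# skipped or cancelled upstream jobs do not fail the gate by themselves.
def confirm_pass(results):
    """Return the exit code the gate job would use for a list of
    upstream job results (e.g. 'success', 'failure', 'skipped')."""
    return 1 if "failure" in results else 0

print(confirm_pass(["success", "success"]))  # 0
print(confirm_pass(["success", "failure"]))  # 1
print(confirm_pass(["skipped", "success"]))  # 0
```

Running the gate with `if: always()` is what guarantees it executes even when upstream jobs are skipped, so the required check always reports a status.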