
Template: Add a GitHub Actions workflow to test a successful download of the pipeline #2618

Merged

Conversation

@MatthiasZepper (Member) commented on Dec 20, 2023

Motivation

When refactoring the 'nf-core download' functionality, I frequently had to account for edge cases in the container declarations. A particular revision of a single pipeline would disrupt an otherwise functional solution. Peculiarities like arbitrarily mixed single and double quotes in the code had to be addressed, and often enough, fixing the parsing for one case would render it defunct for another.

Up until now, the only safeguard against this growing heterogeneity was linting the pipeline code. However, if an unconventional module ever slipped through linting and became part of a released pipeline, nf-core download had to adapt. For this reason, the tests test_find_container_images_modules() and test__find_container_images_config_nextflow() were included in test_download.py.

As a companion to those tests, I propose integrating a GitHub Actions workflow into the pipeline template. This way, every developer can ensure that their new pipeline downloads smoothly. If it does not, developers have the option to fix tools instead of the pipeline; this is why the Action uses the dev branch of tools.

Since this test uses a lot of resources, it only runs when a new PR is opened against dev or master, or when it is triggered manually. Initially, I also wanted to rerun it on new pushes to open PRs, but realized that this might be excessive. A downside of this test is that it downloads all containers, which may exceed the 2 GB of space available on the GitHub-provided runners.

A working version of this has been added to my fork of the Testpipeline. Unfortunately, you might not have permissions to view the run details.
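For orientation, here is a rough sketch of the shape such a workflow could take. This is not the exact workflow added by this PR: the triggers (PRs against dev and master, plus manual dispatch) and the installation of tools from its dev branch reflect the description above, but the step names and the exact `nf-core download` flags are illustrative assumptions.

```yaml
# Hypothetical sketch of a download-test workflow; the actual template
# workflow introduced by this PR differs in detail.
name: Test successful pipeline download with 'nf-core download'

on:
  workflow_dispatch: # allow manual triggering
  pull_request:
    branches: # only run for PRs opened against dev and master
      - dev
      - master

jobs:
  download:
    runs-on: ubuntu-latest
    steps:
      - name: Install Nextflow
        uses: nf-core/setup-nextflow@v1

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.11"

      - name: Install nf-core/tools from its dev branch
        run: pip install git+https://github.com/nf-core/tools.git@dev

      - name: Download the pipeline, including Singularity containers
        env:
          NXF_SINGULARITY_CACHEDIR: ./
        run: |
          nf-core download ${{ github.repository }} \
            --revision dev \
            --outdir ./pipeline \
            --compress none \
            --container-system singularity \
            --container-cache-utilisation amend
```

A stub run of the downloaded pipeline (the `nextflow run ... -stub` step discussed in the review below) would then verify that the download is actually usable.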

PR checklist

- This comment contains a description of changes (with reason)
- CHANGELOG.md is updated
- If you've fixed a bug or added code that should be tested, add tests!
- Documentation in docs is updated

@mashehu (Contributor) commented on Jan 4, 2024

FYI, you can ignore the failing cirun action.

…e a trigger as well. This only applies to last-minute fixes prior to a pipeline release.
@mirpedrol (Member) left a comment

LGTM!

MatthiasZepper and others added 2 commits January 8, 2024 15:10
@MatthiasZepper MatthiasZepper merged commit e8e6372 into nf-core:dev Jan 8, 2024
35 checks passed
@MatthiasZepper MatthiasZepper deleted the GithubAction-DownloadTest branch January 8, 2024 16:32
Review thread on the following workflow step:

```yaml
env:
  NXF_SINGULARITY_CACHEDIR: ./
  NXF_SINGULARITY_HOME_MOUNT: true
run: nextflow run ./${{ env.REPOTITLE_LOWERCASE }}/$( sed 's/\W/_/g' <<< ${{ env.REPO_BRANCH }}) -stub -profile test,singularity --outdir ./results
```
A project member commented:
What's to stop Nextflow from downloading any missing Singularity containers at runtime here? E.g. if the Singularity image has been downloaded with an incorrect filename?

Tagging @mirpedrol @mashehu

@MatthiasZepper (Member, Author) replied:

That is a valid point, although the case of an incorrect filename should be caught by the respective tests in the tools repo.

Do you think that setting NXF_OFFLINE would help? Or should we count the files in the NXF_SINGULARITY_CACHEDIR before and after the pipeline run to corroborate that they are still the same number?

Generally, though, I am already very happy that there is now a download test in the first place, because we had multiple new releases for which nf-core download unexpectedly raised exceptions after some odd declaration slipped through the module linting and review (things like arbitrarily mixed single and double quotes). If this workflow helps us avoid publishing such a crooked module as part of a pipeline release, that is already a big win!
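For illustration, the cache-counting idea could be realized as two extra steps around the stub run, assuming (as in the reviewed snippet) that NXF_SINGULARITY_CACHEDIR is the working directory and that downloaded images carry the `.img` extension; the step names are hypothetical:

```yaml
      - name: Count container images before the stub run
        # Assumes downloaded Singularity images end in '.img'
        run: echo "IMAGES_BEFORE=$(find . -name '*.img' | wc -l)" >> "$GITHUB_ENV"

      # ... the 'nextflow run ... -stub' step goes here ...

      - name: Verify that no additional images were pulled at runtime
        run: |
          IMAGES_AFTER=$(find . -name '*.img' | wc -l)
          if [ "$IMAGES_AFTER" -ne "$IMAGES_BEFORE" ]; then
            echo "Image count changed from $IMAGES_BEFORE to $IMAGES_AFTER" >&2
            exit 1
          fi
```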

The same member replied:

> That is a valid point, although the case of an incorrect filename should be caught by the respective tests in the tools repo.

Good point 👍🏻

> Do you think that setting NXF_OFFLINE would help?

We were chatting about that in the meeting where I added the comment. I'm not confident that it would 👀 😅 We were playing with the idea of cutting off network access on the custom runner, but I think it's almost certainly overkill.

No need to do anything here; as you say, invalid container names are something that we should be testing in other tests. Testing for syntax errors / weirdness here is a good addition as it is 👍🏻

@MatthiasZepper (Member, Author) commented on Jan 23, 2024

If you cut network access entirely, then the most likely test failure will be some issue with the plugins rather than the actual test subject, namely the container images. 😅

I already searched a bit, and there seem to be ways to block network traffic selectively, but only on a per-job and not on a per-step basis. So if one disallowed the registries, the containers would not download successfully beforehand either.
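For what it's worth, one untested idea for approximating per-step blocking on a Linux runner would be an inline iptables step placed between the download and the stub run, since firewall rules installed in one step persist for the remaining steps of the same job. As noted above, though, this would most likely also break plugin resolution unless the plugins were fetched beforehand:

```yaml
      # Untested sketch: these rules persist for all later steps of this job,
      # so the preceding download step remains unaffected.
      - name: Block outbound traffic for the remaining steps
        run: |
          sudo iptables -A OUTPUT -o lo -j ACCEPT # keep loopback alive
          sudo iptables -A OUTPUT -j REJECT       # reject everything else
```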

@edmundmiller added the `download` (nf-core download) label on Jun 13, 2024