[RFC] Revisiting our CI setup: GH Actions, RTD #689
Comments
Here's the GHA setup I use in flask-smorest.
That's a good approach! I've been somewhat disappointed with the publicly available actions for publishing to PyPI in the past. They just don't seem to offer very much in exchange for another dependency in the critical path to publishing.
From #687, there was a question of how to make the release step depend on linting; in the flask-smorest build-release workflow, this is done with a dependency between jobs. I think the simplest thing is to have a near-verbatim copy of the flask-smorest release workflow. There's some minor duplicated linting, but we make sure a release passes checks.
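A hedged sketch of what "release depends on linting" can look like in GH Actions, using a `needs:` edge between jobs. The job names, tox env, and publish steps below are illustrative assumptions, not the actual flask-smorest workflow:

```yaml
# Hypothetical build-release workflow sketch.
name: build-release
on:
  push:
    tags: ["*"]

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v4
        with:
          python-version: "3.11"
      - run: python -m pip install tox
      - run: tox -e lint

  release:
    # The release job only runs if lint succeeds.
    needs: [lint]
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v4
        with:
          python-version: "3.11"
      - run: python -m pip install build twine
      - run: python -m build
      # Plain twine upload, avoiding third-party publish actions.
      - run: twine upload dist/*
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
```

Using `python -m build` plus `twine` directly keeps the publish path free of extra action dependencies, matching the preference above.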
On a related topic, I've been using pip-tools recently (https://github.com/BEMServer/bemserver-core, https://github.com/BEMServer/bemserver-api), inspired by Pallets repositories and APIFlask [1]. I like the fact that the build can be reproducible with all dependencies pinned recursively. I did it a bit differently, with a pre-commit action ensuring pip-compile is run when needed. I find it nice, although not perfect. Not perfect for Pallets either, apparently: pallets/werkzeug#2334

What I dislike is the fact that everything is pinned except pre-commit and tox. Maybe it's not that bad. To actually pin them, I'd need to add dedicated requirements files, or add them to dev.in and install dev.txt in GHA and tox.ini, thus installing unneeded stuff (no need to install pre-commit in the GHA tox action, and no need to install tox inside the tox task). I decided to just let it go, like they do in Pallets. After all, those are just tools, not libraries imported by my code. Maybe I'm overthinking it.

[1] When it comes to CI, marshmallow-code and Pallets are my sources of inspiration.
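The "pre-commit action ensuring pip-compile is run when needed" idea can be sketched as a local pre-commit hook; the file paths and hook id here are assumptions:

```yaml
# .pre-commit-config.yaml fragment (hypothetical local hook)
repos:
  - repo: local
    hooks:
      - id: pip-compile
        name: pip-compile requirements/dev.in
        entry: pip-compile requirements/dev.in --output-file requirements/dev.txt
        language: system
        # Re-run only when the .in or the compiled .txt changes.
        files: ^requirements/dev\.(in|txt)$
        pass_filenames: false
```

Because the hook regenerates `dev.txt` from `dev.in`, a commit that edits `dev.in` without recompiling will fail the hook, keeping the pinned file in sync.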
Also related, we're facing CI limitations in apispec, so I'm probably going to move to GHA as soon as I get the time.
I've put in a PR for this, with most of what I want in place.
I have used pip-tools to deploy applications in the past -- and liked it -- but never as part of working on a library.
💯 ! I agree! I also tend to look at what the pypa and psf repos have, as they often seem to do sophisticated and interesting things.
I think it's good to think about. At this point, I would actually recommend against pinning things that don't fit neatly into the common pattern of "dev dependencies". I used to try to pin these sorts of things aggressively in my workflows, and I've stopped after finding that I wasn't getting very much out of it other than busywork.

We might need to re-assess after the tox rewrite is done and tox v4 is released, but I'm hoping that all of our configs will be supported and that we won't need to make any changes.
Yeah, exactly my point. Even if it breaks with tox 4, it should be obvious what the problem is, and we'll fix it then. No need to add extra work for each minor and patch version.

What can be weird with the setup in Pallets that I copied (IIUC) is that the stuff in `requirements/dev.in` is pinned, but the versions of tox and pre-commit aren't actually the versions used during the tests.
We now use pre-commit.ci, which uses a specific pre-commit image, but pre-commit has other guarantees about compatibility and pinning hooks. Now that you mention it, tox not being pinned was an oversight.

With pip-compile-multi, it's absolutely fine to split up into lots of small requirements files, like one for a pre-commit env (again, check out pre-commit.ci), one for tox, etc. Although I guess it turns out it wasn't a big deal that it was unpinned; it hasn't caused any issues since I started doing all this.

The nice part about having all dev dependencies pinned is that it's much easier to ensure new contributors have a known-good environment. This was important at conference sprints, where we'd need to get lots of people set up quickly and not run into weird version differences.
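Splitting into small per-purpose requirements files, as pip-compile-multi encourages, might look roughly like this (the layout and file names are illustrative):

```
requirements/
├── tests.in    # pytest, coverage plugins, ...
├── docs.in     # sphinx, theme, ...
├── typing.in   # mypy, type stubs
└── dev.in      # "-r tests.in" etc., plus tox and pre-commit

# Running `pip-compile-multi` compiles each .in into a fully pinned
# .txt next to it, resolving the -r cross-references between files.
```

Each environment (tox, docs build, pre-commit) then installs only its own pinned `.txt`, instead of one oversized `dev.txt` everywhere.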
Just to be clear, we're fine with pip-tools; pip-compile-multi is a wrapper around it that automates updating lots of requirements files. What I was unhappy with was Dependabot, which was noisy and also only worked with the output of plain pip-compile, not the slightly different output of pip-compile-multi.
I think we just forgot about this issue. Closing as complete (doing some open-issue review right now).
In #687, we started to discuss our more general setup in terms of Azure Pipelines vs GitHub Actions (GH Actions).
A couple of us support the switch to GH Actions, and there's no strong objection to it. I'd like to try that out.
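As a hedged sketch of the shape this could take, here is a reusable tox workflow plus a caller job with a matrix. All file names, inputs, job names, and versions below are hypothetical, not the actual webargs workflows:

```yaml
# .github/workflows/reusable-tox.yml (hypothetical)
name: reusable tox
on:
  workflow_call:
    inputs:
      python-version:
        required: true
        type: string
      tox-env:
        required: true
        type: string
      os:
        required: false
        type: string
        default: ubuntu-latest

jobs:
  tox:
    runs-on: ${{ inputs.os }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v4
        with:
          python-version: ${{ inputs.python-version }}
      - run: python -m pip install tox
      - run: tox -e ${{ inputs.tox-env }}
```

```yaml
# .github/workflows/build.yml (hypothetical caller)
name: build
on: [push, pull_request]

jobs:
  build:
    strategy:
      matrix:
        python-version: ["3.8", "3.11"]
        os: [ubuntu-latest, windows-latest]
    # Reusable workflows can be referenced via a local path.
    uses: ./.github/workflows/reusable-tox.yml
    with:
      python-version: ${{ matrix.python-version }}
      tox-env: py
      os: ${{ matrix.os }}
```

Keeping the reusable file local first, then moving it to a shared repo later, only changes the `uses:` reference in the caller.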
The setup I want to try would be something like a reusable workflow to run `tox`, in the repo, and then a job named `build` which runs the tox workflow over a matrix of python version, OS, and tox env names. Now that reusable workflows support using a local file, we can develop something in `webargs` until we like the result. If we then want to move it to a separate project like `marshmallow-code/reusable-github-workflows`, we can discuss and/or do that.

There ~~are two notable jobs~~ is one job in Azure Pipelines which I don't think belongs in the move to GH Actions:

- `tox -e docs`
- ~~pypi publish~~ (done in Replace Azure Pipelines with GitHub Actions #690)

For the docs, ReadTheDocs (RTD) offers a relatively new feature to build on PRs; I recall we discussed it when it was still in beta. I like this primarily because it's driven by the `.readthedocs.yml` in the repo, so we don't run into trouble where our `tox -e docs` run in CI differs from what happens in RTD. It also means that if we have a PR to update `.readthedocs.yml` config, it will be tested in the PR build.

All we have to do is enable this feature in the readthedocs.org admin pane and make sure that the RTD webhook is set to send Pull Request events. A trial PR could demonstrate that this works -- e.g. by updating the python version used for RTD.
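For reference, a minimal `.readthedocs.yml` of the kind being described might look like this; the Python version, extras name, and paths are illustrative assumptions:

```yaml
# .readthedocs.yml (illustrative sketch)
version: 2

build:
  os: ubuntu-22.04
  tools:
    python: "3.11"

python:
  install:
    - method: pip
      path: .
      # Assumes the project defines a "docs" extra with sphinx deps.
      extra_requirements: [docs]

sphinx:
  configuration: docs/conf.py
```

Because PR builds read this file from the PR's branch, a change to the Python version here would be exercised by RTD before merge.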
~~For pypi publishing, I think we can leave this in Azure for the initial move to GitHub Actions. It should be possible to set up a tag->publish reusable workflow, but I want to get more "basic" CI worked out before taking a crack at this.~~ (done in #690)

With #690 now merged, we have a short "remaining" TODO list: