Automating releases, building and publication of package artefacts #142
Initial work has been done to showcase automated releases and image builds for Federated Wiki using the GitHub infrastructure. Since the manifests of the @federated-wiki initiative had not been updated since the beginning of the pandemic, they were in a good spot to exemplify the suggested procedure. Were, because they now support (semi-)automatic builds, via the "detect and tag new version (manifest)" workflow.
Using the CI workflow (manifest) directly allows creating interim releases in addition to the upstream versions.
This way it is easy to automatically track upstream wiki versions and to offer pull requests that increase the package version, while still allowing manual interim releases for different purposes, mainly patch and test releases. The artefacts are uploaded to the Packages repository of the GitHub organisation. Following suggested snippets in the Docker, Node.js and GitHub documentation, the build is performed for both the NPM package and the OCI image. Now it will be much easier and much more deterministic to create new versions of our federated wiki distribution for the local use cases.
I see little benefit in adding a script that will be doing little more than automating this. I think for your upstream workflows, you are asking us to increment the wiki version when any of the included wiki parts is updated. I don't see a problem with remembering to do that. Not sure how your automation copes when we package either wiki or any of its parts with a dist-tag.
Thanks for looking at this. Indeed my automation only picks up from what is published to these repositories, basically the Dockerfiles there, and has no integration whatsoever with the wiki upstream here or on NPM yet. Marking releases as tags and pushing them here, eventually also using the GitHub releases feature on them, will only make it easier for third parties to follow what's going on in Wiki land, and is not needed for the procedures above. I'd still consider NPM the canonical source for Wiki releases, which could be polled by a daily job for changes, avoiding any tight integration, e.g. with webhooks. Building always up-to-date images would then follow from those releases. What's left then is to increase the wiki version when one of its downstream packages changes its version, too, yes. The automation could pick this up with regular polling, as mentioned above.
I have now set up a clunky extension of the previous example, which queries NPM once a day for new wiki versions, compares them with the one available there, and builds a new container when necessary. This has been exemplified in a task run. It remains to increase the base image version of the dependent distribution once a new base image has been released. Now I also understand your remark regarding a mutable dist-tag. The script queries NPM ¹ for the latest version; whenever this differs from the previous version in the Dockerfile ², it is used for the next release.
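The exact snippets referenced by the footnotes were lost from this thread. A minimal sketch of such a daily check in Python, where the registry URL layout, the `ARG WIKI_VERSION` pin in the Dockerfile, and the helper names are all assumptions rather than the script actually used:

```python
"""Hedged sketch: poll the NPM registry for the latest `wiki` release and
compare it against the version pinned in a Dockerfile. File layout and
pin format (`ARG WIKI_VERSION=...`) are assumptions for illustration."""
import json
import re
import urllib.request
from typing import Optional


def latest_npm_version(package: str) -> str:
    """Fetch the version behind the `latest` dist-tag from the public registry."""
    url = f"https://registry.npmjs.org/{package}/latest"
    with urllib.request.urlopen(url) as response:
        return json.load(response)["version"]


def pinned_version(dockerfile_text: str) -> Optional[str]:
    """Extract an assumed version pin such as `ARG WIKI_VERSION=0.30.0`."""
    match = re.search(r"^ARG WIKI_VERSION=(\S+)$", dockerfile_text, re.MULTILINE)
    return match.group(1) if match else None


def _as_tuple(version: str) -> tuple:
    return tuple(int(part) for part in version.split("."))


def needs_release(pinned: str, latest: str) -> bool:
    """Release only for a strictly newer upstream version, so a downgraded
    or unchanged dist-tag never triggers a build."""
    return _as_tuple(latest) > _as_tuple(pinned)
```

The strict `>` comparison is what keeps a downgraded upstream version from producing a new release, matching the behaviour described later in the thread.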
There appear to be many ways in which this can break. To get this right, I had to revert version-increase commits multiple times, because the triggers were not working as expected. The workflow that checks for new versions would also try to generate a new release for a downgraded version, but fails sufficiently thanks to the preexisting tag. Let's see if the schedule actually works, and if it manages to react to new releases on its own now.
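The combination of a daily schedule and a preexisting-tag guard could look roughly like the following workflow fragment. The workflow name, cron time, and `v`-prefixed tag scheme are assumptions, not the repository's actual configuration:

```yaml
name: Detect new wiki version   # assumed name
on:
  schedule:
    - cron: "0 6 * * *"         # once a day, as described above
  workflow_dispatch: {}
jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0        # tags are needed for the guard below
      - name: Stop when the tag already exists
        run: |
          latest="$(npm view wiki version)"
          if git rev-parse -q --verify "refs/tags/v${latest}" >/dev/null; then
            echo "v${latest} already released, nothing to do"
            exit 0
          fi
```

The guard step is what makes a downgrade "fail sufficiently": a release for an already-tagged version is simply skipped.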
There are now automated releases for the packaged distribution of wiki. I can enter the old and the new version number, and end up with new tagged releases for the distribution, both for the git tag and the OCI image tag. There is still room for improvement, for example building from an implicit base image tag. There are other ways to update container images when their base images change, but for now I like the explicitness of syncing the version tag to the wiki upstream of the distribution. Another next step will be to reactivate pushing to Docker Hub, which is more convenient to use for most.
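Entering the old and new version numbers by hand suggests a `workflow_dispatch` trigger with two inputs. A sketch of that shape, where the input names, the `Dockerfile` target, and the commit message are all assumptions:

```yaml
name: Upgrade wiki distribution   # assumed name
on:
  workflow_dispatch:
    inputs:
      old_version:
        description: "Currently packaged wiki version"
        required: true
      new_version:
        description: "Upstream wiki version to package"
        required: true
jobs:
  upgrade:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Bump, tag and push
        run: |
          sed -i "s/${{ github.event.inputs.old_version }}/${{ github.event.inputs.new_version }}/" Dockerfile
          git config user.name "github-actions"
          git config user.email "actions@users.noreply.github.com"
          git commit -am "wiki ${{ github.event.inputs.new_version }}"
          git tag "v${{ github.event.inputs.new_version }}"
          git push --follow-tags
```

Pushing the tag would then let a separate build workflow produce the matching OCI image tag.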
I liked what you said in #143 (comment)
Which totally makes sense, given that packaging exists and works.
Just confirming that this works now as intended. The last wiki release appeared in the morning, which allowed me to manually initiate the upgrade, and a new container was born. Now that I have learned about the …
Hi! I have observed the Federated Wiki build toolchain is "missing" automated builds to NPM and the GitHub package repository or an OCI registry.
The current release processes depend on manual work by @paul90 in regular exchange with @WardCunningham. In times of Microsoft pleasing the world with GitHub Actions, we can choose to accept their offering and have it help us to automate releases "away", by reducing a few manual steps.
Upstream
This can benefit by a generalized workflow repository in the `fedwiki` organisation, from which manifests may be reused. It could carry this forge's brand in the `.github` repository, be an `actions` repository, or carry any self-chosen name, maybe just here in `wiki` with a little overhead in a `.github`-branded directory. Would there be opinions on this?

When there are no objections to the overall procedure, I would like to proceed with:
- `wiki-client` releases trigger package build and publication for NPM and OCI
- a `wiki-client` release triggers the creation of a PR that offers to increase this dependency's version here
- `wiki-server` releases trigger package build and publication for NPM and OCI
- a `wiki-server` release triggers the creation of a PR that offers to increase this dependency's version here

Downstream
Release automation for `wiki` is in place that …

Later we could reuse these preparations to have the plugin repositories behave in a similar way. Also self-updating draft releases, which aggregate recently merged PRs since the last tag, can save a few clicks when doing bulk operations. In tying Wiki closer to GitHub, we receive in return some support to ease maintenance chores.
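One existing option for such self-updating draft releases is the release-drafter action; a minimal configuration sketch, assuming a `main` default branch and default drafter settings:

```yaml
name: Draft release notes   # assumed name
on:
  push:
    branches: [main]
permissions:
  contents: write
  pull-requests: read
jobs:
  update_release_draft:
    runs-on: ubuntu-latest
    steps:
      - uses: release-drafter/release-drafter@v6
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```

Each merged PR then lands in a continuously updated draft release, which only needs publishing at tag time.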
Paul and Ward, would the above seem useful to you?