Rework Actions to work well with Merge Queues and Enable Benchmarks in there #525
Merged
Conversation
franziskuskiefer approved these changes (Aug 22, 2024)
I think this is doing what it's supposed to.
Though, I have been thinking that before as well 😉
Let's give it a try!
github-merge-queue bot removed this pull request from the merge queue due to no response for status checks (Aug 22, 2024)
github-merge-queue bot removed this pull request from the merge queue due to no response for status checks (Aug 26, 2024)
Had to move the utils into the crate, because otherwise they would be picked up by the benchmarks and fail, since libtest would again complain about the --output-format argument.
github-merge-queue bot removed this pull request from the merge queue due to failed status checks (Aug 26, 2024)
github-merge-queue bot removed this pull request from the merge queue due to failed status checks (Aug 26, 2024)
github-merge-queue bot removed this pull request from the merge queue due to failed status checks (Aug 27, 2024)
github-merge-queue bot removed this pull request from the merge queue due to failed status checks (Aug 27, 2024)
This PR reworks the actions a little bit to work well with merge queues and to run the benchmarks in the merge queue. Since this is difficult to test, this PR will probably see some more commits :)
To understand the structure, let's go over the way the rules work. There is a list of required checks, which is just a list of strings. GitHub requires that at least one job with that name ran, and that all jobs with that name succeeded. It does not care whether these are actually the same jobs.
There are now three classes of workflow files:
So far, the problem was that in the merge queue, GitHub only runs the jobs whose names are listed in the required checks list. This makes it difficult to run something only in the merge queue, because it would then also run in the PRs, or the PR checks wouldn't go through because they would be blocked on a check that never runs. But remember: GitHub doesn't care whether these are the same checks! It only cares that there are checks with the right name and that they are successful, even if there is just one check of that name.
We only use the `on` setting of the workflow to configure which events trigger the jobs therein. We now do the following:
```yaml
on: [ merge_group, workflow_dispatch ]
```
This way, the PR checks are satisfied, and when in the merge queue, it runs the benchmarks.
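For illustration, here is a minimal sketch of that setup; the file names, job names, and steps are hypothetical and only meant to show the name-matching trick, not the actual workflows in this repository. A PR workflow defines a cheap job under the required check's name, while a separate workflow with the same job name runs the real benchmarks only in the merge queue:

```yaml
# .github/workflows/pr.yml (hypothetical file name)
# Runs on pull requests; this `benchmarks` job only exists so that a check
# with the required name exists and succeeds in the PR context.
name: PR
on: [ pull_request ]
jobs:
  benchmarks:
    runs-on: ubuntu-latest
    steps:
      - run: echo "benchmarks only run in the merge queue"
```

The workflow that actually runs the benchmarks uses the `on` line quoted above, so it is only triggered in the merge queue or manually:

```yaml
# .github/workflows/benchmarks.yml (hypothetical file name)
name: Benchmarks
on: [ merge_group, workflow_dispatch ]
jobs:
  benchmarks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Placeholder for the real benchmark invocation.
      - run: cargo bench
```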
Oh, and this PR also adds the benchmark HTML generation step.
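The PR does not spell out how the HTML is generated, so the following is only a hedged sketch of one common approach: pipe the bencher-formatted output into benchmark-action/github-action-benchmark, which renders the benchmark history as an HTML graph. All step names, paths, and the choice of action below are assumptions, not taken from this PR.

```yaml
# Hypothetical steps inside the merge-queue benchmarks job.
- name: Run benchmarks in bencher format
  run: cargo bench -- --output-format bencher | tee bench-output.txt
- name: Generate and publish the benchmark graph
  uses: benchmark-action/github-action-benchmark@v1
  with:
    tool: cargo                      # parses bencher-style output lines
    output-file-path: bench-output.txt
    github-token: ${{ secrets.GITHUB_TOKEN }}
    auto-push: true                  # publish the generated HTML to gh-pages
```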
Working towards #54.
Implementation note: We have to add a setting to the Cargo.toml of all the crates with criterion benchmarks that we want to track in a graph. The reason is that the graph wants the benchmarks in a special format, so we have to pass `--output-format bencher`. However, by default `cargo bench` only accepts command line arguments that libtest also accepts. It seems the only way is to ask libtest to not handle benchmarks. Cf. the criterion FAQ.
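The exact snippet is not reproduced above; based on the criterion FAQ, the usual fix is to disable the default libtest harness so that the extra flags reach criterion. A sketch with placeholder names:

```toml
# Cargo.toml of a crate with criterion benchmarks (names are placeholders).
[lib]
bench = false            # don't let libtest benchmark the library target

[[bench]]
name = "my_benchmark"    # hypothetical benchmark target
harness = false          # use criterion's main() instead of libtest
```

With that in place, `cargo bench -- --output-format bencher` is handled by criterion instead of being rejected by libtest as an unknown option.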