Update CHANGELOG.md with new limits on uploading SARIF #1493
Conversation
I wonder if we can indicate that this is a server-side change in some way. Specifically, we don't want GHES users who use GitHub Connect pulling this in and expecting the maximum number of SARIF runs per file to increase.
Good point. I can specifically call this out as being dotcom-only. Though that raises a question: will the change be available in the next release of GHES?
The API change has been released; the API docs update goes out in an hour (though I think the docs then take a couple more hours to update after the release).
Co-authored-by: Jenny Rukman <jennyrocku@github.com>
Thanks for the reviews, everyone. If this is good, I think it's safe to merge into main so it will be available in the next release. Can someone give this another review and approval?
@JennyRockU Sorry for the double ping — would you mind confirming that this change will land in GHES 3.9? Thanks!
Yes, it should go into GHES 3.9.
Remove apiVersion parameter.
Good point. I removed it.
I can confirm that the API change is now live in the docs:
I have been running into issues due to this change: my action to upload the SARIF file fails. Below is the action
The error that I am seeing in the workflow run is Error: Code Scanning could not process the submitted SARIF file: How do I fix this?
Hello @patiupe, can you please open a new issue for this? This change bumped the limit from 15 runs to 20 runs, so 30 runs is still too many for us to handle. In the new issue someone can help you work around it.
Great to see the limit increased, but it's still too low. For us the limit of 20 is actually a dealbreaker as well, and my company is not alone. After some Googling, I found folks much more patient than I am who even split their pipeline up into an entire matrix of chunks to work around the same limitation: https://github.com/vmware-tanzu/community-edition/actions/runs/2253393437/workflow . Monorepos are a thing, and 20 is just too limited. It's also a very unnecessary constraint: it doesn't actually consider the number of violations inside the check run, the file size, or any meaningful metric that affects the “complexity” or “processing time”. It seems a typical case of premature optimization. So far my support tickets and talks with my solutions engineer/sales rep have not been particularly fruitful. I know this is just the issue tracker for the action and not the backend, but hopefully we can get this constraint lifted one day.
Thanks for your feedback @jwgmeligmeyling. I understand the SARIF run limit can be difficult for monorepos, and I've recorded your feedback for prioritisation in the future. We impose this limit for technical reasons, including scalability impact. We aren't currently working on increasing this limit, but I will keep you updated if this changes and we prioritize this request.
https://github.com/OpenMined/PySyft/actions/runs/5572810151/jobs/10179266350 Error: Code Scanning could not process the submitted SARIF file: I don't understand what I am supposed to do here. Is it the number of errors in this one SARIF file? Do I need to fix these bugs on my machine first, or clear the entire set of security issues in GitHub first?
Every SARIF file contains one or more runs, typically just a single run, so the issue is probably the number of SARIF files uploaded. You can upload at most a generous 20 runs (so 20 files with a single run each). Sorry for bringing bad news! 😊 Maybe with the advancements in quantum computing, someday we can process 21 runs simultaneously.
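To check whether you are hitting the run limit rather than some other validation error, you can count the runs across your SARIF files before uploading. A minimal sketch (not an official tool; `count_runs` is a hypothetical helper name):

```python
import json


def count_runs(sarif_path: str) -> int:
    """Count the runs in a single SARIF file.

    A SARIF log is a JSON object whose top-level "runs" array holds one
    entry per analysis run; the limit discussed in this thread applies
    to the number of runs, not to file size or alert count.
    """
    with open(sarif_path, encoding="utf-8") as fh:
        log = json.load(fh)
    return len(log.get("runs", []))
```

Summing `count_runs` over every file in an upload tells you how close you are to the 20-run ceiling.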
@jwgmeligmeyling is correct. This is a restriction on the number of runs allowed in a SARIF file. To get around this limitation, you can split the file into chunks of <= 20 runs each and upload them individually. For each upload, you should specify a unique and consistent category property.
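The chunking suggested above can be sketched as follows. This is an illustrative script, not an official tool; `split_sarif` and the `.partN.sarif` naming are assumptions, and each output file would then be uploaded with its own category:

```python
import json

# Current server-side limit discussed in this thread.
MAX_RUNS_PER_UPLOAD = 20


def split_sarif(sarif_path: str, chunk_size: int = MAX_RUNS_PER_UPLOAD) -> list[str]:
    """Split one SARIF log into files of at most `chunk_size` runs each.

    Each chunk keeps the original top-level metadata ("version",
    "$schema", ...) and only partitions the "runs" array, so every
    output file is itself a valid SARIF log.
    """
    with open(sarif_path, encoding="utf-8") as fh:
        log = json.load(fh)
    runs = log.get("runs", [])
    out_paths = []
    for i in range(0, len(runs), chunk_size):
        chunk = {k: v for k, v in log.items() if k != "runs"}
        chunk["runs"] = runs[i : i + chunk_size]
        out_path = f"{sarif_path}.part{i // chunk_size}.sarif"
        with open(out_path, "w", encoding="utf-8") as fh:
            json.dump(chunk, fh)
        out_paths.append(out_path)
    return out_paths
```

Because results are grouped per category on the server, keeping the category for each chunk stable across commits (e.g. derived from the chunk index) avoids alerts flapping between uploads.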
Hello, I got the same issue as @madhavajay: I cannot upload my 68 SARIF files to GitHub by using the So I tried to merge my SARIF files with JS code inside a custom action, but this didn't work because I received a 422 response from the GitHub API. I also tried to use sarif-sdk from Microsoft, but without success. Do you have an alternative for this problem? If not, do you now plan to increase this upload limit? @jwgmeligmeyling Did you find a solution for your monorepo?
At this point I feel like you guys are just provoking me. 😉 No, I did not find a solution; I cancelled GHAS after my trial period.
@sacha-athias-wmx, please create a new issue for this and we can help you out. As I mentioned above, I recommend that you split your SARIF into separate files and run the upload for each file separately. Each file should be uploaded with a unique category to disambiguate it from the other files.
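For anyone scripting the per-file uploads outside the action, the public REST endpoint `POST /repos/{owner}/{repo}/code-scanning/sarifs` accepts the SARIF body gzip-compressed and base64-encoded, with a `category` field per upload. A minimal sketch under those assumptions (`encode_sarif` and `upload_sarif` are hypothetical helper names, not part of any library):

```python
import base64
import gzip
import json
import urllib.request


def encode_sarif(raw: bytes) -> str:
    """Gzip-compress and base64-encode a SARIF payload, as the
    code-scanning SARIF endpoint expects."""
    return base64.b64encode(gzip.compress(raw)).decode("ascii")


def upload_sarif(owner: str, repo: str, token: str, commit_sha: str,
                 ref: str, sarif_path: str, category: str) -> dict:
    """Upload one SARIF file chunk with its own category."""
    with open(sarif_path, "rb") as fh:
        body = json.dumps({
            "commit_sha": commit_sha,
            "ref": ref,                      # e.g. "refs/heads/main"
            "sarif": encode_sarif(fh.read()),
            "category": category,            # unique per chunk, stable per commit
        }).encode("utf-8")
    req = urllib.request.Request(
        f"https://api.github.com/repos/{owner}/{repo}/code-scanning/sarifs",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # contains an id for polling upload status
```

Calling `upload_sarif` once per chunk, each with a distinct category, keeps every request under the per-file run limit.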
Let's not merge this until the backend change is available.