
Reduce CI time #5713

Open
zanieb opened this issue Aug 1, 2024 · 13 comments
Labels: testing (Internal testing of behavior), tracking (A "meta" issue that tracks completion of a bigger task via a list of smaller scoped issues.)

Comments

zanieb (Member) commented Aug 1, 2024

Running tests in CI now takes >8 minutes (it was previously <1 minute). We've already done a lot to optimize this, e.g.:

But CI time keeps growing as we add more features and coverage. This is a tracking issue to improve the situation and discuss sources of slowness.

See also:

zanieb added the testing and tracking labels on Aug 1, 2024
zanieb (Member, Author) commented Aug 1, 2024

Would it be absurd to introduce "CI time" regression checks in CI, like the CodSpeed benches? Unfortunately, GitHub runner performance is super noisy, so it might not work.

helderco commented Aug 1, 2024

Wonder if Dagger could help here 🙂

samypr100 (Collaborator) commented Aug 1, 2024

Self-hosted runners could arguably help here too, but the maintenance/security/cost burden is likely too large.

zanieb (Member, Author) commented Aug 1, 2024

I strongly considered self-hosted runners, but it seemed painful to orchestrate Windows runners in particular.

zanieb added a commit that referenced this issue Aug 1, 2024
Part of #5713

Shaves 50s or ~25% off the Ubuntu test run. Maybe 30s or 8% off macOS.
Windows already uses the GitHub distributions.

Note this is some of our only test coverage for Python version installs;
we may want to add separate coverage to compensate.
zanieb (Member, Author) commented Aug 2, 2024

I also very much want to look into something like #609 again to cache our network traffic — I think that'd help a lot.

samypr100 (Collaborator) commented Aug 2, 2024

> I also very much want to look into something like #609 again to cache our network traffic — I think that'd help a lot.

At least for Python, it might be worth using something like https://github.com/hauntsaninja/nginx_pypi_cache, saving a copy of the cache as a GitHub Actions cache, and reloading it later. I actually use it locally a lot to speed up tests. Not sure how it would perform in CI.

charliermarsh (Member) commented:

That strikes me as a good idea...

zanieb (Member, Author) commented Aug 2, 2024

Ah that might be easier than using mitmproxy or rolling our own proxy in Rust. Thanks for the link!

edit: I created a published image at https://github.com/astral-sh/nginx_pypi_cache/pkgs/container/nginx_pypi_cache that we can use in our jobs, if someone wants to trial it in Ubuntu (or even locally, to start).
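For anyone trialing it locally, here's a minimal sketch of what that could look like. The ghcr.io tag is inferred from the package page, and the host port, the container's listen port, and the /simple index path are assumptions (check the nginx_pypi_cache README for the actual values); whether our test suite picks up UV_INDEX_URL is also an open question.

```sh
# Start the caching proxy in the background.
# ASSUMPTIONS: the ghcr.io image tag and the container's listen port (80).
docker run --rm -d -p 8080:80 --name pypi-cache \
  ghcr.io/astral-sh/nginx_pypi_cache:latest

# Point uv at the local cache instead of PyPI while running the tests.
# ASSUMPTION: the test suite respects UV_INDEX_URL from the environment.
UV_INDEX_URL=http://localhost:8080/simple cargo test

# Tear down when done (the container is removed automatically via --rm).
docker stop pypi-cache
```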

zanieb (Member, Author) commented Aug 7, 2024

Linux is acceptably fast now. We're at the limit for macOS machine size without going to alternative runner providers. There are larger Windows runners; maybe I should test one (#5890). Cost may become a problem at some point; alternative runner providers may be cheaper (they often advertise something like a 2x cost reduction).

zanieb added a commit that referenced this issue Aug 7, 2024
This saves about 10-20s

Part of #5713
zanieb added a commit that referenced this issue Aug 8, 2024
Might be pushing it on test coverage, but these are some of our slowest
tests, so we might get a significant speedup here.

Part of #5713
ChannyClaus (Contributor) commented:

Have we considered using https://bazel.build/? (Correct me if I'm wrong, but cargo test doesn't seem to skip tests when none of their dependencies have changed.)

Admittedly, adding Bazel would significantly increase the complexity though 🙀

eth3lbert (Contributor) commented:

Yes, I've heard a lot of complaints about Bazel's complexity.

ChannyClaus (Contributor) commented:

> Yes, I've heard a lot of complaints about Bazel's complexity.

This is definitely true. That said, I've seen it improve CI time drastically, so it may be something to consider down the line if there's no other option...
