Cargo eating all RAM during aarch64 build under Docker on x86-64 #10583
#8405 is one potential solution for this.
Hm, I feel like this is something different, because I only observe this issue in the scenario above with QEMU-based emulation. Let me know if I can add some instrumentation to help debug this (the reproduction steps should be reliable too, since everything is containerized).
I'm fairly certain that the memory consumption happens during repo cloning (git dependencies), before any compilation starts.
If it is related to git clone, perhaps set the config `net.git-fetch-with-cli = true`.
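For reference, `net.git-fetch-with-cli` is a standard Cargo option and can be set either in a config file or via an environment variable. A minimal sketch (the per-project config path is the conventional location):

```toml
# .cargo/config.toml — make Cargo shell out to the `git` CLI for
# fetches instead of using the built-in libgit2, which is what
# consumes excessive memory under QEMU emulation.
[net]
git-fetch-with-cli = true
```

The one-off equivalent is `CARGO_NET_GIT_FETCH_WITH_CLI=true cargo build`.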
Thank you @weihanglo, it helped!
I'm seeing the exact same behavior as OP; however, setting the `net.git-fetch-with-cli` flag didn't seem to reduce the memory consumption. In my case, I was able to work around it by bind-mounting my Cargo caches into the container:
I suppose the docker-compose.yml equivalent would work as well, but in some CI scenarios this may not be possible.
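The exact command from the comment above wasn't preserved in this thread; a sketch of the general shape, with assumed host paths, container paths, and image name:

```shell
# Bind-mount the host's Cargo registry and git caches into the
# container so fetches reuse already-downloaded data instead of
# cloning through emulated libgit2. Paths/image are illustrative.
docker run --rm \
  --platform linux/arm64 \
  -v "$HOME/.cargo/registry:/usr/local/cargo/registry" \
  -v "$HOME/.cargo/git:/usr/local/cargo/git" \
  my-build-image cargo build --release
```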
- CI | Another attempt to fix vaultwarden builds on emulated aarch64: rust-lang/cargo#10583
- GMediaRender | Updated to version 0.0.9 and aligned service name with Debian and upstream service and executable name. The update can be applied via reinstall: dietpi-software reinstall 163 - CI | Fix vaultwarden builds on emulated aarch64: rust-lang/cargo#10583 - Other minor updates to new software packages
- Workaround rust-lang/cargo#10583 (plus routine dependency bumps via dependabot)
- References: - Background: rust-lang/cargo#10583 - Solution: Qiskit/rustworkx#713
- Fix found here: rust-lang/cargo#10583 This problem seems to exist on aarch64 building for amd64 and vice versa.
- Ref: rust-lang/cargo#10583 (comment) Bumped dependencies; re-enabled multi-arch image build
I just ran into this issue as well.
Maybe related: pyca/cryptography#8640
libgit2 can consume a lot of memory when cross-compiling for arm64. As suggested here [1], let's use the git executable to prevent this issue. [1] rust-lang/cargo#10583 (comment) Fixes: rust-vmm#79 Signed-off-by: Stefano Garzarella <sgarzare@redhat.com>
…ing cross-compiling (#4828) ## Issue Addressed #4827 ## Proposed Changes This PR introduces a new build-arg to the Lighthouse Dockerfile: `CARGO_USE_GIT_CLI`. This arg will be passed into the `CARGO_NET_GIT_FETCH_WITH_CLI` [environment variable](https://doc.rust-lang.org/cargo/reference/config.html#netgit-fetch-with-cli), which instructs `cargo` to use the git CLI during `fetch` operations instead of the git library. Doing so works around [a bug](rust-lang/cargo#10583) with the git library that causes it to go OOM during `fetch` operations on `arm64` platforms. The default value is `false` so this doesn't affect Lighthouse builds or the CI pipeline. Running a build with `--build-arg CARGO_USE_GIT_CLI=true` will activate it, which is necessary to cross-compile the `arm64` binary when not using `cross` (i.e., when building via the Dockerfile instead of natively if you don't have a rust environment ready to go). Special thanks to @michaelsproul for helping me repro the initial problem. Co-authored-by: Michael Sproul <micsproul@gmail.com>
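The build-arg plumbing described above can be sketched as a Dockerfile fragment. The `CARGO_USE_GIT_CLI` and `CARGO_NET_GIT_FETCH_WITH_CLI` names come from the PR description; the base image, paths, and build command are placeholders:

```dockerfile
# Illustrative fragment: forward a build-arg into the env var
# Cargo reads. "true" makes Cargo fetch with the git CLI instead
# of libgit2, avoiding the OOM on emulated arm64.
FROM rust:1.70 AS builder
ARG CARGO_USE_GIT_CLI=false
ENV CARGO_NET_GIT_FETCH_WITH_CLI=$CARGO_USE_GIT_CLI
COPY . /build
WORKDIR /build
RUN cargo build --release
```

Activated with `docker buildx build --platform linux/arm64 --build-arg CARGO_USE_GIT_CLI=true .`, while the default `false` leaves native builds unchanged.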
Cargo uses libgit2, which causes issues when building for aarch64 targets: rust-lang/cargo#10583 As this is one of many errors caused by libgit2, it is advised to use the local git binary: https://docs.shipyard.rs/configuration/git-fetch-with-cli.html
Problem
I'm trying to build an application, this time building an aarch64 container image on an x86-64 machine.
My machine is a beefy 5900X with 128G of RAM, but the system runs out of memory (Cargo eats it all) when cross-compiling the aarch64 container.
Steps
docker buildx build --platform linux/arm64 -t test -f Dockerfile-farmer .
Possible Solution(s)
No response
Notes
x86-64 build on the same machine works fine with ~20G of system memory.
Not entirely sure if this is Cargo's fault, QEMU's, or something else, but I have used similar QEMU-based setups before and never seen anything remotely like this.
Version
UPD: Tried the most recent version, still the same issue: