How do you deal with large packages checked into .yarn/cache with compressionLevel: 0?
#5921 · Unanswered
stovmascript asked this question in Q&A
I'm updating our `.yarn/cache` and would like to migrate to the default `compressionLevel` of `0`, since it makes sense and genuinely reduces the amount of data transferred over the wire during checkout. Some of our dependencies are quite large, though, and I'm already running into warnings and errors from GitHub's file size limits (GitHub warns on files over 50 MB and blocks pushes containing files over 100 MB).
I'm curious: how would you approach this?
The files that trigger the errors are native builds, which I'm thinking of excluding from the cache altogether, since they would be fetched on the host machine prior to a build anyway, albeit breaking the offline-mirror feature a bit.
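If we go that route, a minimal sketch would be to `.gitignore` the offending archives so Yarn fetches them from the registry at install time instead of from the checked-in mirror (the package names below are placeholders; substitute whatever exceeds the limit in your repo):

```gitignore
# .gitignore — keep the offline mirror, except for oversized native packages.
# Names are illustrative; Yarn fetches anything missing from the cache
# from the registry during `yarn install`.
.yarn/cache/esbuild-npm-*
.yarn/cache/sharp-npm-*
```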
Another possibility would be to track everything over 50 MB with LFS, but as I understand it there are transfer limits, and it's currently not possible to tell LFS to track files by size. It also seems somewhat odd to track files that are already available on a public registry.
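Since LFS can't track by size directly, one workaround would be to generate explicit per-file rules instead. A rough sketch, assuming a bash shell at the repo root (note that files already committed to history would additionally need `git lfs migrate import` to be rewritten):

```bash
# Generate an LFS rule for every cache archive currently over 50 MB.
# New files that later cross the threshold won't be caught automatically;
# this would have to be re-run (e.g. from a CI check or pre-commit hook).
find .yarn/cache -type f -size +50M -print0 |
  while IFS= read -r -d '' f; do
    git lfs track "$f"   # appends a pattern for this path to .gitattributes
  done
git add .gitattributes
```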