```shell
# start a new shell with a private mount namespace (Linux-only)
sudo unshare -m bash
largefile=...
# find the right store path
storepath=$(nix-store --print-fixed-path sha256 "$(nix-hash --type sha256 --flat --base32 "$largefile")" "$(basename "$largefile")") && echo "$storepath"
# copy/move the file there (as root)
mount -o remount,rw /nix/store
mv "$largefile" "$storepath"
# register the file (also as root): path, empty deriver, zero references
printf '%s\n\n0\n' "$storepath" | nix-store --register-validity --reregister
# exit to the shell where /nix/store is still mounted read-only
exit
```
Describe the bug

I wanted to use Nix to package machine-learning data sets. One example is the COCO dataset (http://cocodataset.org/#download); its train2017.zip archive is 18 GiB. When I use nix-prefetch-url, it downloads the entire file, but then fails during the calculation of the hash, resulting in:

Expected behavior

I'd expect Nix to handle arbitrarily large files; surely there shouldn't be a memory limit on the kinds of packages we can work with.
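As a sanity check on the expected behavior, here is a minimal sketch (plain coreutils, not Nix itself) showing that SHA-256 hashing is a streaming operation: `sha256sum` reads its input in fixed-size chunks, so peak memory use is independent of file size. The file path and the 64 MiB size are arbitrary choices for illustration; the same holds for an 18 GiB archive.

```shell
# Create a 64 MiB test file (size is arbitrary; hashing is streaming,
# so the same approach works for a multi-GiB archive).
dd if=/dev/zero of=/tmp/bigfile bs=1M count=64 2>/dev/null

# sha256sum processes the input chunk by chunk; memory stays constant
# regardless of input size.
sha256sum /tmp/bigfile | cut -d' ' -f1
```

This is why a hash-computation step, in principle, should never need to hold the whole file in memory.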
Nix version is: