I'm trying to hash the contents of https://github.com/cdnjs/cdnjs so I can detect and rewrite all in-browser requests for cdnjs.cloudflare.com to go to IPFS and be loaded from my local cache or a nearby computer. It's about 29GB of highly duplicated content and contains 2.5 million files. The command I'm using is `ipfs add -r -H cdnjs/ajax/libs/` and the ipfs version is `0.4.2`.
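To make the goal concrete, here is a hypothetical example of the rewrite (the root hash is a placeholder for whatever the add eventually prints, and the jquery path is just one illustrative cdnjs file):

```sh
# Original request:
#   https://cdnjs.cloudflare.com/ajax/libs/jquery/2.2.4/jquery.min.js
# Rewritten to the local IPFS gateway (default port 8080), where <rootHash>
# is the hash printed by `ipfs add -r -H cdnjs/ajax/libs/` once it finishes:
curl "http://127.0.0.1:8080/ipfs/<rootHash>/jquery/2.2.4/jquery.min.js"
```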
I tried running this with the IPFS daemon, but it failed with `10:29:48.139 ERROR commands/h: unexpected EOF client.go:247` after running all night. This morning, I killed the daemon and ran the same command again. It seems to be more reliable and faster without the daemon, but after a few hours it used up all the RAM on my system, slowed down, and I had to kill it.
As a workaround, I suppose I could hash it in chunks and then combine them with MFS (something like the sketch below), but 29GB doesn't seem like an unreasonable amount of data to add at once.
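A rough, untested sketch of that chunked workaround, assuming one `ipfs add` per top-level library directory, with each result grafted into a single MFS directory:

```sh
# Add each library separately, then assemble them under one MFS tree.
ipfs files mkdir /cdnjs
for dir in cdnjs/ajax/libs/*/; do
  name=$(basename "$dir")
  # -q prints one hash per item added; the last line is the directory root
  hash=$(ipfs add -r -H -q "$dir" | tail -n1)
  ipfs files cp "/ipfs/$hash" "/cdnjs/$name"
done
# hash of the combined tree
ipfs files stat /cdnjs
```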
go-ipfs 0.4.2 has a pretty severe memory leak. It was fixed on current master, but there is another one, which we are tracking in #2823. Also, it looks like you are duplicating effort: @magik6k took over my job on that and is also adding cdnjs: ipfs-inactive/archives#35
I will close this now, as the issue is already tracked elsewhere.