net::ERR_INSUFFICIENT_RESOURCES with many files in precache #118
Chrome will, once the device-specific storage limits are reached, start returning errors like the one you're seeing. And in general, it's a really unfriendly approach to have anyone who visits your site once start downloading gigabytes of pictures. You should rethink your strategy here, and not use
Yes, I have 5 GB of pictures (I work on a 3D web app). I don't really understand net::ERR_INSUFFICIENT_RESOURCES; is this error caused by making many HTTP GETs at the same time? If so, is it possible to batch them? Otherwise, should I put just the important files in staticFileGlobs, put the other files in runtimeCaching, and prefetch them manually with importScripts? But that seems a pity, because it's almost the same thing as putting everything in staticFileGlobs, no?
I'm not sure whether I'll reopen this to investigate at some point down the line whether there are any benefits to batching the fetch/cache operations into groups instead of attempting them all at once, but given that it might not be possible to support this use case efficiently, it's not a high priority right now. Feel free to experiment using the service worker and Cache Storage APIs directly and see if you're able to accomplish what you're attempting.
I was having this issue with this PWA: https://porrasta-cfb21.firebaseapp.com/, which caches 200+ files. Batching the fetch/cache operations into groups of 20 requests solved it:

```js
self.addEventListener('install', function(event) {
  event.waitUntil(
    caches.open(cacheName).then(function(cache) {
      return setOfCachedUrls(cache).then(async function(cachedUrls) {
        const cacheKeys = Array.from(urlsToCacheKeys.values());
        const chunkSize = 20;
        // Split the full list of cache keys into chunks of at most chunkSize.
        const cacheKeysChunks = new Array(Math.ceil(cacheKeys.length / chunkSize)).fill().map(_ => {
          return cacheKeys.splice(0, chunkSize);
        });
        // Process one chunk at a time; requests within a chunk run in parallel.
        for (const chunk of cacheKeysChunks) {
          await Promise.all(
            chunk.map(function(cacheKey) {
              // If we don't have a key matching this URL in the cache already, add it.
              if (!cachedUrls.has(cacheKey)) {
                var request = new Request(cacheKey, {credentials: 'same-origin'});
                return fetch(request).then(function(response) {
                  // Bail out of installation unless we get back a 200 OK for
                  // every request.
                  if (!response.ok) {
                    throw new Error('Request for ' + cacheKey + ' returned a ' +
                      'response with status ' + response.status);
                  }
                  return cleanResponse(response).then(function(responseToCache) {
                    return cache.put(cacheKey, responseToCache);
                  });
                });
              }
            })
          );
        }
      });
    }).then(function() {
      // Force the SW to transition from installing -> active state.
      return self.skipWaiting();
    })
  );
});
```
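The chunking idea in the comment above can also be factored into small standalone helpers that don't depend on sw-precache's generated internals (`setOfCachedUrls`, `urlsToCacheKeys`, `cleanResponse`). A minimal sketch, where the helper names are mine and the `cache` argument is anything with a Cache-style `add` method, so it can also be exercised with a stub outside a service worker:

```js
// Split an array into consecutive chunks of at most `size` elements.
function chunk(array, size) {
  const chunks = [];
  for (let i = 0; i < array.length; i += size) {
    chunks.push(array.slice(i, i + size));
  }
  return chunks;
}

// Cache URLs chunk by chunk: requests within a chunk run in parallel,
// but the next chunk only starts once the previous one has finished,
// which keeps the number of in-flight fetches bounded.
async function cacheInChunks(cache, urls, size) {
  for (const batch of chunk(urls, size)) {
    await Promise.all(batch.map(url => cache.add(url)));
  }
}
```

Inside a real install handler this would be called as `event.waitUntil(caches.open(cacheName).then(cache => cacheInChunks(cache, urls, 20)))`.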
I've got the same issue with my game: https://david.azureedge.net/applescrusher/index.html. After the second refresh I get net::ERR_INSUFFICIENT_RESOURCES. Tested on a Samsung S8 (Android 7.0) with Chrome 69.
With lodash you can do something like this (I had to add a timeout between my batched requests, though):

```js
fetch('/reporter/current_user')
  .then(response => response.json())
  // Wrap addAll in an arrow function: passing cache.addAll directly to
  // .map() detaches it from `cache` and throws an illegal-invocation error.
  .then(files => Promise.all(_.chunk(files, 100).map(chunk => cache.addAll(chunk))))
  .catch(error => console.log(`Error caching files: ${error}`));
```
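Since the comment above mentions needing a timeout between batches, here is a sketch of how that pacing could look. The function name and parameters are mine; `cache` is any object with an `addAll`-style method, and chunks run strictly one after another with a pause in between rather than all at once:

```js
// Add URLs to the cache in sequential batches, waiting `delayMs`
// between batches so the browser isn't flooded with requests.
async function addAllInBatches(cache, urls, batchSize, delayMs) {
  for (let i = 0; i < urls.length; i += batchSize) {
    await cache.addAll(urls.slice(i, i + batchSize));
    // Pause before the next batch (skip the pause after the last one).
    if (i + batchSize < urls.length) {
      await new Promise(resolve => setTimeout(resolve, delayMs));
    }
  }
}
```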
I got the same issue in Vue.js.
omg you are the MAN |
Hi, I got the same error running a parallel Cypress regression inside a Docker image, but locally (outside Docker) I'm able to run more parallel regressions without any issue. Do you know what I can tweak in my Dockerfile to solve the resources issue?
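For the Docker case, one common cause (called out in Cypress's own Docker guidance) is that Chrome inside a container gets only the default 64 MB of shared memory at /dev/shm, which parallel browser runs can exhaust. A sketch of the usual `docker run` tweaks, where the image name and command are placeholders for your setup:

```shell
# Give the container a larger shared-memory segment:
docker run --shm-size=2g my-cypress-image npx cypress run

# Or share the host's IPC namespace (and its /dev/shm) instead:
docker run --ipc=host my-cypress-image npx cypress run
```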
Great approach and solution! May I ask if there is a purpose for the file hashes? I did not find any use of them in your service worker. Thanks 👍 Edit: I guess a new version of a file means a new hash, which generates a new URL for the cache storage, which ensures... ?
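To the question above: the hashes are revision identifiers. sw-precache appends a content hash to each precached URL (the `?sw-precache=<hash>` query seen elsewhere in this thread), so on a new deploy only files whose content actually changed get a new URL and need re-downloading; unchanged entries are reused from the existing cache. A sketch of that diffing step, with a function name of my own choosing rather than sw-precache's:

```js
// Given the URLs already present in the cache and the new precache
// manifest (entries of the form url?sw-precache=<hash>), return only
// the entries that need to be fetched. An unchanged file keeps the
// same hash, so its existing cache entry still matches and is skipped.
function urlsToUpdate(cachedUrls, manifestUrls) {
  const cached = new Set(cachedUrls);
  return manifestUrls.filter(url => !cached.has(url));
}
```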
Hi,
I have a web app project with many pictures (around 10,000 files, 5 GB) and I would like it to work offline from the start, so I initialize my staticFileGlobs with all the pictures.
It works fine with 5,000 pictures, but with 10,000 I get 5,000 errors like this:
GET http://localhost:8080/index.html?sw-precache=0c563970c0e2bcb7d00c6c6accbee96f net::ERR_INSUFFICIENT_RESOURCES
I use Chrome on OS X.
Thanks
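The split suggested later in this thread (precache only the essentials, cache the picture set lazily) would look roughly like the sw-precache config below. The glob paths and URL pattern are placeholders for this project's actual layout, not values from the original report:

```js
// sw-precache configuration: precache only the small app shell up front,
// and let the large picture set be cached on first request instead of
// during install, avoiding thousands of simultaneous fetches.
module.exports = {
  staticFileGlobs: [
    'app/index.html',
    'app/js/**/*.js',
    'app/css/**/*.css'
  ],
  runtimeCaching: [{
    urlPattern: /\/pictures\//,
    handler: 'cacheFirst'
  }]
};
```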