🐛 Bug Report - TypeError: memory limit has been exceeded with inline data: URL
#2998
Comments
I'm unable to reproduce any failure here with workerd given the example included. The error that you mention ("Parser error: The memory limit has been exceeded") is an error that HTMLRewriter would throw (it comes from lol-html). Your example is not using HTMLRewriter, however, so I'm a bit confused. Can you please clarify? Also, are you seeing this error with a production worker (e.g. wrangler deploy, then run the worker) or in local dev (e.g. wrangler dev), etc.?
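For reference, HTMLRewriter has to be invoked explicitly in a worker; a minimal sketch of the kind of usage that would route a response through lol-html (the URL matches the repro below, but the selector and handler are illustrative, not from the reporter's code):

```js
// Illustrative only: explicit HTMLRewriter usage. lol-html parser errors like
// "memory limit has been exceeded" originate from this streaming transform.
export default {
  async fetch(request) {
    const upstream = await fetch("http://localhost:9000/10mb.html");
    return new HTMLRewriter()
      .on("img", {
        element(el) {
          // Runs per <img> element as the document streams through the parser.
          console.log(el.getAttribute("src"));
        },
      })
      .transform(upstream);
  },
};
```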
This has me incredibly confused as well; there is indeed no instance of HTMLRewriter. What I posted as the worker script is exactly what I am testing with, in a fresh/empty directory too. The exception is from the html-rewriter code, though:
A couple more pieces:
Also, I notice that the first request will succeed with a bad response (no error, but only 25 bytes sent) and the second request will fail with a bad response (error raised and only 25 bytes sent).
I can only follow along loosely with C++, but notice there is implicit parsing of data URLs outside the HTML rewriter: workerd/src/workerd/api/data-url.c++, line 17 (commit a9cf631).
It seems like there is some processing of data URLs as sub-requests inside a fetch().
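If that path is reachable from a worker, a direct probe might look like the following; whether fetch() will accept a data: URL in this runtime at all is an assumption, not something confirmed in the thread:

```js
// Hypothetical probe: request a data: URL directly to exercise the data-url
// parsing path in isolation. Assumes the runtime's fetch() services data: URLs.
export default {
  async fetch() {
    const res = await fetch("data:text/plain;base64,aGVsbG8="); // decodes to "hello"
    return new Response(await res.text());
  },
};
```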
Yeah, I saw that parsing of the data: URL. When you say you aren't able to reproduce this bug, what do you notice on the response? Are you getting 25 bytes back, or the full 10MB? Would a docker compose or something help verify this bug? I could stumble through rebuilding workerd if needed.
Keeping in mind that I'm testing with workerd only (not wrangler).
I don't know enough about using workerd directly; if there is something I can run locally to confirm what you're seeing, let me know. Is it possible this is specific to wrangler/workers-sdk otherwise?
It's possible, since in that path something may be injecting polyfills, and perhaps HTMLRewriter usage, into the worker. Take a look through the compiled source that the CLI generates and see if you can spot anywhere HTMLRewriter may be getting added.
Are you describing generating this with Wrangler? With something like:
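A plausible invocation, assuming wrangler's `--dry-run`/`--outdir` flags:

```sh
# Assumed command: bundle the worker without deploying, then inspect the output
wrangler deploy test.js --dry-run --outdir dist
```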
When I do that, the bundled file looks like this:

```js
// test.js
var test_default = {
async fetch(request, env, ctx) {
const response = await fetch("http://localhost:9000/10mb.html");
return response;
}
};
export {
test_default as default
};
//# sourceMappingURL=test.js.map
```
I don't know exactly what to generate here, so here's a docker compose with the minimal reproduction. This is just the example above, but isolated via containers to prove that there is indeed no HTMLRewriter defined at any point in the worker:

https://github.com/agjohnson/wrangler-large-data-url

Let me know if any part of that isn't clear. If this seems like a wrangler issue, could we transfer the issue?
What is happening
When serving an HTML file with a large (>10MB) inline `data:` URL resource, even a basic worker will halt the response and respond with: `TypeError: Parser error: The memory limit has been exceeded.`
I thought this was simply the response size, but this is not the case:

- If the same file is served as `text/plain`, the file will send through the worker.
- If the file contents are instead generated from `/dev/random`, the file will send through the worker.

So, it does seem to me to be very specific to inline data URLs.
Does this imply the worker `fetch()` is somehow inspecting the data URL during streaming? This seems odd, but if so, is there a missing option to disable this?

Reproducing
Example data URL HTML generator:
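A minimal sketch of such a generator, with the payload size and MIME type assumed (7.5MB of random bytes base64-encode to roughly 10MB of attribute data):

```sh
#!/bin/sh
# gen.sh - emit an HTML page whose <img> src is a ~10MB inline data: URL.
printf '<html><body><img src="data:image/png;base64,'
head -c 7500000 /dev/urandom | base64 | tr -d '\n'
printf '"></body></html>\n'
```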
Generate the HTML file and serve it:
```sh
% ./gen.sh > 10mb.html
% python -m http.server 9000
```
And the worker:
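Reconstructed from the bundled output quoted above, the worker is simply a passthrough fetch:

```js
// test.js - fetch the large HTML file from the local origin and return it as-is
export default {
  async fetch(request, env, ctx) {
    const response = await fetch("http://localhost:9000/10mb.html");
    return response;
  }
};
```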
Run and try to fetch:
```sh
% wrangler dev test.js
% curl -o - http://localhost:8787 > /dev/null
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100    25    0    25    0     0     46      0 --:--:-- --:--:-- --:--:--    46
```
Notice only 25 bytes sent, which is the start of the `img` element.