Possible memory leak in http2 #21332
Comments
@nodejs/http2
@ryzokuken Looked into it, but couldn't find anything yet… It's not #21336; that just popped up while investigating. :)
I wish it were related and we had a fix ready :)
@ryzokuken It's an increase in RSS but not in heap size … it's not what we'd expect from a typical leak in JS. Also, as I understand it, this only happens with …
@addaleax If you were referring to …
I'll investigate over the weekend if this isn't fixed by then.
I couldn't reproduce the result under … I expect the inconsistencies in the observed RSS memory usage to be caused by the memory allocator here.
Moreover, calling … My test code (client only, server is the same): https://gist.github.com/ChALkeR/aa392f5bb957279f0527e7f74e3b72da I would say that something suboptimal is going on with memory allocation and garbage collection here, but this doesn't look like a memory leak. I.e. the memory is freed correctly; RSS just doesn't decrease completely because of how the memory allocator works.
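The effect described above (heap returns to baseline, RSS stays elevated because freed pages are not immediately returned to the OS) can be sketched without http2 at all. This is a hypothetical illustration, not ChALkeR's actual test code; run it with `node --expose-gc`:

```javascript
'use strict';
// Allocate a large amount of short-lived V8 heap memory, drop it, force a
// garbage collection, and compare heapUsed vs. RSS. heapUsed typically
// returns close to its baseline, while RSS often stays elevated because
// the allocator keeps freed pages around for reuse.
function snapshotKB() {
  const m = process.memoryUsage();
  return {
    rss: Math.round(m.rss / 1024),
    heapUsed: Math.round(m.heapUsed / 1024),
  };
}

const before = snapshotKB();

let garbage = [];
for (let i = 0; i < 2e5; i++) {
  garbage.push({ i, padding: 'x'.repeat(64) }); // plain heap objects
}
garbage = null;              // make everything collectable
if (global.gc) global.gc();  // only available with --expose-gc

const after = snapshotKB();
console.log(`heapUsed delta: ${after.heapUsed - before.heapUsed} KB`);
console.log(`rss delta:      ${after.rss - before.rss} KB`);
```

On most platforms the heapUsed delta is small while the RSS delta remains large, which matches "freed correctly, but RSS doesn't shrink" rather than a genuine leak.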
@addaleax Could you point me to a nightly build that contains these changes? (I don't have much experience compiling Node, nor do I have an environment set up for that.) I can test it in production to see if it behaves differently; I can definitely see #21336 being related, as there are some "short" custom headers used. Considering what @ChALkeR said, I am less sure about this being a leak in http2; if that's the case, I am really sorry for pointing you in the wrong direction :( What initially put me off the "this is just how GC behaves" explanation is the fact that the observed RSS growth appears to completely ignore …
Note: I am not entirely sure my analysis above is correct; let's wait to hear what @addaleax thinks. I might have missed something.
@addaleax Since #21336 made it into the 10.5.0 release, I have been running tests on several of the production systems and it's looking good. I will make sure to put #21373 and the new custom memory allocator #21374 through testing once they make it into the next release. As for my original issue, it appears to be fully resolved in 10.5.0, yay \o/ I will keep this ticket open for now in case there is anything more to add here.
@cTn-dev I might've misread, but it seems like the issues have been resolved? If so, I will close this out, but do feel free to reopen if I misunderstood or if you find more issues. I'm strictly doing this so it's easier for everyone to identify which issues still exist and to focus on fixing them.
Hi, this is my first bug report for Node, hopefully I get it right!
TLDR: http2session.request appears to be leaking memory outside of JavaScript (the heap is clean, but the resident set size keeps growing).
I am attaching rather crude test code which demonstrates the issue and reproduces it every time.
The zip file attached contains client.js, server.js and "keys" folder containing self signed SSL keys.
http2test.zip
node server.js
node --expose-gc client.js
Depending on the performance of your machine, you should see the "test output" in < 15 seconds or so.
What happens
1. Record process.memoryUsage() for later comparison.
2. Issue the requests and force garbage collection (the client runs with --expose-gc).
3. Call process.memoryUsage() again and calculate the delta for rss, heapTotal and heapUsed.
You should get an output similar to this:
Deltas: 0 KB Heap Total, -52 KB Heap Used, 10436 KB RSS
Feel free to play with the number of requests you make; obviously, a higher number takes longer to execute but leaves a much higher RSS.
Here is a graph of rss, heap total and heap used over 24h on a system that does ~30k requests per day; it has been running for several days (hence the larger RSS).
Sorry I wasn't able to narrow it down any further (it took me over a month to trace the issue this far in my system, as it kept leaking rather slowly and heap snapshot comparisons never revealed anything).