WIP http_parser: don't reuse HTTPParser #24330
Conversation
As discussed in nodejs/diagnostics#248 and nodejs#21313, reusing the HTTPParser resource is a blocker for landing `require('async_hooks').currentAsyncResource()`, since reusing an async resource internally would make it infeasible to use them reliably in WeakMaps. Two suggestions came up: have a wrapper around the HTTPParser which we would use as the async resource, or stop reusing HTTPParser in our HTTP server/client code. This commit implements the latter, since we have a better GC now and reusing HTTPParser might not make sense anymore. This will also avoid one extra JS->C++ call in some cases, which should improve performance for these cases.
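To make the constraint concrete, here is a minimal sketch of the pattern the proposal wants to enable, assuming the proposed `currentAsyncResource()` API (not yet in core) and hypothetical `setContext`/`getContext` helpers: if the parser object is reused as the async resource, two unrelated requests share the same WeakMap key and their stored context collides.

```js
// Sketch only: currentAsyncResource() is the proposed API discussed above,
// not something that exists in core at this point.
const { currentAsyncResource } = require('async_hooks');

const contexts = new WeakMap();

function setContext(value) {
  // Key per-request context on the current async resource.
  contexts.set(currentAsyncResource(), value);
}

function getContext() {
  return contexts.get(currentAsyncResource());
}

// If the HTTPParser object is reused across requests, request B's
// setContext() silently overwrites request A's entry, because both
// requests observe the same resource object as their key.
```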
Force-pushed from c642522 to 73e2db6
Regressions on
I'll see if there's anything else we can optimize to compensate for these regressions. Also, I'm not sure if we can break the HTTPParser API. It's not documented, and this PR removes the
@mcollina @Flarna @AndreasMadsen PTAL and let me know what you think.
Full HTTP Benchmark Results
(edit by @AndreasMadsen: fixed formatting in
Yeah. Increased Benchmark CI to 100 runs. There are quite a few comparisons, so I would like a suitably low confidence threshold (alpha) if possible. https://ci.nodejs.org/view/Node.js%20benchmark/job/benchmark-node-micro-benchmarks/274/ It will take quite a while, so feel free to stop it if there is something else that is more important.
Picking only the most significant results:
Tbh, I am not a fan of doing this – I was actually thinking about introducing similar mechanisms for other parts of Node, because there is noticeable overhead associated with creating C++ objects. If we want less of a performance impact, we probably need to address that in V8 in some way before going this route…
@mmarchini As you pinged me to comment on this: I'm sorry, but I don't feel qualified to rate the results of these benchmarks. Clearly more overhead is bad, but I can't tell the impact of these micro benchmarks on real world apps. I'm sure there are others in the Node community with more experience in this than me. @addaleax Just for my personal interest: do you have some hint or link to more in-depth info on what exactly is causing the overhead with C++ objects? E.g. is it that the GC has to call a finalizer, or the crossing of the JS/C++ border? I googled around but haven't found details.
@Flarna It’s the crossing of the JS/C++ border – there’s a lot of overhead in both directions… I didn’t see GC pop up as a significant slowdown in benchmarks, actually. And I wouldn’t expect finalizers to come with a lot of overhead, as that doesn’t require V8 to change engine state in some way…
The goal here is to make the async resource unique for each async operation. I'm open to suggestions :)
I think it's a good performance optimization as long as it doesn't hurt Node.js debuggability and observability.
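As a rough illustration of the observability angle, the sketch below uses the existing `async_hooks.createHook()` API to check whether the parser resource seen in `init` repeats across requests; the `'HTTPPARSER'` type string and the logging details are assumptions for illustration only.

```js
// Sketch: detect whether the HTTP parser async resource is being reused.
// The 'HTTPPARSER' type string is an assumption about the provider name
// reported for the C++ parser object at this point in time.
const async_hooks = require('async_hooks');
const fs = require('fs');

const seen = new WeakSet();
async_hooks.createHook({
  init(asyncId, type, triggerAsyncId, resource) {
    if (type === 'HTTPPARSER') {
      // fs.writeSync avoids generating more async activity from inside the hook.
      fs.writeSync(1, `HTTPPARSER resource is ${seen.has(resource) ? 'reused' : 'new'}\n`);
      seen.add(resource);
    }
  }
}).enable();
```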
That would help I think. @hashseed @bmeurer any thoughts on this?
I mean, the
Sounds good; using
I like this idea, I'm trying it right now :)
IncomingMessage works for HTTPParser.REQUEST :D Should we use ClientRequest for HTTPParser.RESPONSE? (ref: https://github.com/nodejs/node/blob/master/lib/_http_client.js#L631-L635)
+1.
Using I think we should not use the same resource type (e.g.
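If distinct resource types are used for the server-side and client-side parsers, consumers can tell the two apart directly in the `init` hook. A sketch, with the caveat that the exact type strings (`'HTTPINCOMINGMESSAGE'`, `'HTTPCLIENTREQUEST'`) are assumptions about what the final change would report:

```js
// Sketch: distinguishing server-side and client-side HTTP parser activity
// by async_hooks resource type. The type strings below are assumed names,
// not something this PR itself defines.
const async_hooks = require('async_hooks');
const fs = require('fs');

async_hooks.createHook({
  init(asyncId, type, triggerAsyncId, resource) {
    if (type === 'HTTPINCOMINGMESSAGE') {
      // Server side: the resource is (or wraps) the IncomingMessage.
      fs.writeSync(1, `server parse, asyncId=${asyncId}\n`);
    } else if (type === 'HTTPCLIENTREQUEST') {
      // Client side: the resource is (or wraps) the ClientRequest.
      fs.writeSync(1, `client parse, asyncId=${asyncId}\n`);
    }
  }
}).enable();
```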
@addaleax Hmm, but why is it slower then? The initial approach doesn't increase the number of native calls, just changes from
Regarding that and previous work that has been done using JavaScript: would it not be possible to compile
Closing in favor of #25094
Change the resource being used; previously HTTPParser was being reused. We are now using IncomingMessage and ClientRequest objects. The goal here is to make the async resource unique for each async operation. Refs: nodejs#24330 Refs: nodejs/diagnostics#248 Refs: nodejs#21313 Co-authored-by: Matheus Marchini <mat@mmarchini.me>
Change the resource being used; previously HTTPParser was being reused. We are now using IncomingMessage and ClientRequest objects. The goal here is to make the async resource unique for each async operation. Refs: #24330 Refs: nodejs/diagnostics#248 Refs: #21313 Co-authored-by: Matheus Marchini <mat@mmarchini.me> PR-URL: #25094 Reviewed-By: Matteo Collina <matteo.collina@gmail.com> Reviewed-By: Benedikt Meurer <benedikt.meurer@gmail.com> Reviewed-By: Anna Henningsen <anna@addaleax.net>
As discussed in nodejs/diagnostics#248 and
#21313, reusing HTTPParser resource
is a blocker for landing
require('async_hooks').currentAsyncResource()
, since reusing an async resource internally would make it infeasible to use them reliably in
WeakMaps. Two suggestions came up: have a wrapper around the HTTPParser
which we would use as the async resource, or stop reusing HTTPParser in
our HTTP server/client code.
This commit implements the latter, since we have a better GC now and
reusing HTTPParser might not make sense anymore. This will also avoid one
extra JS->C++ call in some cases, which should improve performance for
these cases.
Checklist
make -j4 test (UNIX) or vcbuild test (Windows) passes