
feat: add async HTTP context #18

Merged: 1 commit into adonisjs:develop on May 28, 2021
Conversation

@targos (Member) commented Jun 30, 2020

@RomainLanz @thetutlage

This is very much WIP, but I'm opening it early to discuss the API and the approach taken.

@targos targos marked this pull request as draft June 30, 2020 10:58
@targos (Member, Author) commented Jun 30, 2020

The CI fails on Node.js 12.0.0 because AsyncLocalStorage is only available since v12.17.0.

@RomainLanz (Member) commented Jun 30, 2020

Here are some benchmark results showing the performance impact of enabling AsyncLocalStorage.

The first benchmark is a simple Hello World.

Without the AsyncLocalStorage

PS D:\Git\targos\adonis-experiments> autocannon http://localhost:3333
Running 10s test @ http://localhost:3333
10 connections

┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬──────────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max      │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼──────────┤
│ Latency │ 1 ms │ 1 ms │ 2 ms  │ 3 ms │ 1.07 ms │ 0.56 ms │ 26.52 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴──────────┘
┌───────────┬────────┬────────┬─────────┬─────────┬─────────┬─────────┬────────┐
│ Stat      │ 1%     │ 2.5%   │ 50%     │ 97.5%   │ Avg     │ Stdev   │ Min    │
├───────────┼────────┼────────┼─────────┼─────────┼─────────┼─────────┼────────┤
│ Req/Sec   │ 4005   │ 4005   │ 8027    │ 8247    │ 7575.91 │ 1167.24 │ 4004   │
├───────────┼────────┼────────┼─────────┼─────────┼─────────┼─────────┼────────┤
│ Bytes/Sec │ 665 kB │ 665 kB │ 1.33 MB │ 1.37 MB │ 1.26 MB │ 194 kB  │ 665 kB │
└───────────┴────────┴────────┴─────────┴─────────┴─────────┴─────────┴────────┘

Req/Bytes counts sampled once per second.

83k requests in 11.06s, 13.8 MB read

With the AsyncLocalStorage

PS D:\Git\targos\adonis-experiments> autocannon http://localhost:3333
Running 10s test @ http://localhost:3333
10 connections

┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬──────────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max      │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼──────────┤
│ Latency │ 1 ms │ 1 ms │ 3 ms  │ 4 ms │ 1.17 ms │ 0.65 ms │ 28.05 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴──────────┘
┌───────────┬────────┬────────┬─────────┬─────────┬─────────┬────────┬────────┐
│ Stat      │ 1%     │ 2.5%   │ 50%     │ 97.5%   │ Avg     │ Stdev  │ Min    │
├───────────┼────────┼────────┼─────────┼─────────┼─────────┼────────┼────────┤
│ Req/Sec   │ 3017   │ 3017   │ 6163    │ 6219    │ 5760.46 │ 904.61 │ 3017   │
├───────────┼────────┼────────┼─────────┼─────────┼─────────┼────────┼────────┤
│ Bytes/Sec │ 556 kB │ 556 kB │ 1.13 MB │ 1.14 MB │ 1.06 MB │ 166 kB │ 555 kB │
└───────────┴────────┴────────┴─────────┴─────────┴─────────┴────────┴────────┘

Req/Bytes counts sampled once per second.

63k requests in 11.06s, 11.7 MB read

We can see that we lose ~20k requests over 10s.


In a real application, we may have many await statements, and each of them adds to the AsyncLocalStorage overhead.
Here's another benchmark with many awaits.

Route.get('/', async () => {
  await asyncFn();
  await asyncFn();
  await asyncFn();
  await asyncFn();
  await asyncFn();
  await asyncFn();
  await asyncFn();
  await asyncFn();
  await asyncFn();
  await asyncFn();
  return { hello: AsyncHttpContext.getContext()?.request.id() || 'NOTHING' }
})

async function asyncFn() {
  await new Promise((resolve) => setImmediate(resolve));
}

Without the AsyncLocalStorage

┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬──────────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max      │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼──────────┤
│ Latency │ 1 ms │ 1 ms │ 2 ms  │ 3 ms │ 1.06 ms │ 0.37 ms │ 17.28 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴──────────┘
┌───────────┬────────┬────────┬─────────┬─────────┬─────────┬─────────┬────────┐
│ Stat      │ 1%     │ 2.5%   │ 50%     │ 97.5%   │ Avg     │ Stdev   │ Min    │
├───────────┼────────┼────────┼─────────┼─────────┼─────────┼─────────┼────────┤
│ Req/Sec   │ 3881   │ 3881   │ 7807    │ 7975    │ 7367    │ 1131.91 │ 3880   │
├───────────┼────────┼────────┼─────────┼─────────┼─────────┼─────────┼────────┤
│ Bytes/Sec │ 563 kB │ 563 kB │ 1.13 MB │ 1.16 MB │ 1.07 MB │ 164 kB  │ 563 kB │
└───────────┴────────┴────────┴─────────┴─────────┴─────────┴─────────┴────────┘

Req/Bytes counts sampled once per second.

81k requests in 11.06s, 11.7 MB read

With the AsyncLocalStorage

┌─────────┬──────┬──────┬───────┬──────┬─────────┬─────────┬──────────┐
│ Stat    │ 2.5% │ 50%  │ 97.5% │ 99%  │ Avg     │ Stdev   │ Max      │
├─────────┼──────┼──────┼───────┼──────┼─────────┼─────────┼──────────┤
│ Latency │ 2 ms │ 2 ms │ 4 ms  │ 5 ms │ 2.17 ms │ 0.75 ms │ 21.08 ms │
└─────────┴──────┴──────┴───────┴──────┴─────────┴─────────┴──────────┘
┌───────────┬────────┬────────┬────────┬────────┬─────────┬────────┬────────┐
│ Stat      │ 1%     │ 2.5%   │ 50%    │ 97.5%  │ Avg     │ Stdev  │ Min    │
├───────────┼────────┼────────┼────────┼────────┼─────────┼────────┼────────┤
│ Req/Sec   │ 2159   │ 2159   │ 4499   │ 4651   │ 4114.11 │ 764.39 │ 2159   │
├───────────┼────────┼────────┼────────┼────────┼─────────┼────────┼────────┤
│ Bytes/Sec │ 313 kB │ 313 kB │ 652 kB │ 674 kB │ 596 kB  │ 111 kB │ 313 kB │
└───────────┴────────┴────────┴────────┴────────┴─────────┴────────┴────────┘

Req/Bytes counts sampled once per second.

41k requests in 10.06s, 5.96 MB read

We can see that we lose far more req/sec with multiple awaits in the handler.
Note that this ~50% drop amounts to only about 1 ms of extra latency.
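For context, "activating the AsyncLocalStorage" in these runs presumably means wrapping every request in a storage.run() call, roughly along these lines (a minimal Node.js sketch, not the actual code from this PR):

const http = require('http')
const { AsyncLocalStorage } = require('async_hooks')

const storage = new AsyncLocalStorage()

http
  .createServer((req, res) => {
    // Every async operation started inside this callback can read the store.
    storage.run({ url: req.url }, () => handle(req, res))
  })
  .listen(3333)

async function handle(req, res) {
  await new Promise((resolve) => setImmediate(resolve))
  // getStore() returns the object passed to run() for this particular request.
  res.end(JSON.stringify(storage.getStore()))
}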


@thetutlage (Member)

Hey @targos, thanks for the PR, this is going to be a helpful addition.

@RomainLanz @targos Yes, the performance hit is quite visible. If I am not wrong, the performance degrades linearly with every async/await call?

Also, I think the API is currently new, so we will see performance improvements in the future, right? Or is that something that will never happen because of the design of async local storage?

@targos (Member, Author) commented Jul 3, 2020

If I am not wrong, the performance degrades linearly with every async/await call?

I have not checked whether it's linear, but it seems so, yeah. The reason is that for every await, the spec requires the engine to create multiple promises. Node.js has to go through the async hooks init callback for each of them to propagate the context. This affects async hooks in general, not only AsyncLocalStorage.
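A quick way to see this effect in isolation (a standalone sketch, unrelated to the Adonis code) is to count PROMISE init events around a chain of awaits:

const { createHook } = require('async_hooks')

let promiseInits = 0
createHook({
  init(asyncId, type) {
    // Every promise the engine allocates for an await goes through this callback.
    if (type === 'PROMISE') promiseInits++
  },
}).enable()

async function main() {
  await Promise.resolve()
  await Promise.resolve()
  await Promise.resolve()
}

main().then(() => {
  // Typically prints a number noticeably larger than the number of awaits;
  // the exact count depends on the Node.js/V8 version.
  console.log('PROMISE init callbacks:', promiseInits)
})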

Also, I think the API is currently new, so we will see performance improvements in the future, right? Or is that something that will never happen because of the design of async local storage?

I have looked a bit at the implementation and it seems that performance improvements are possible in the future.

/cc @vdeturckheim do you think I've implemented it correctly on the Adonis side?

@vdeturckheim left a review comment

From what I understand, the implementation LGTM.

@vdeturckheim

It might also be interesting to add a handler at process level for unhandled exceptions - with the context, you should be able to trigger the framework's error logic from there.
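For illustration, such a process-level handler could look roughly like this; it assumes (as this PR provides) that the context is readable from anywhere, and the specific error-handling response is hypothetical:

import AsyncHttpContext from '@ioc:Adonis/Core/AsyncHttpContext'

// Hypothetical wiring: if a rejection escapes a request, recover the ctx from
// the async context and answer the request instead of failing silently.
process.on('unhandledRejection', (error) => {
  const ctx = AsyncHttpContext.getContext()
  if (ctx) {
    ctx.response.status(500).send({ error: 'Internal server error' })
  } else {
    console.error('Unhandled rejection outside of a request', error)
  }
})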

@thetutlage (Member)

@targos

So, what do you suggest on how we should roll it out?

For me, the biggest advantage is being able to use the logger and profiler from the current HTTP request when performing certain actions like SQL queries. But I am not sure if it is worth the performance degradation that comes with it.

@thetutlage thetutlage added the Status: Blocked The work on the issue or the PR is blocked. Check comments for reasoning label Jul 24, 2020
src/AsyncHttpContext/index.ts: outdated review comments (resolved)
src/Server/index.ts: outdated review comments (resolved)
@puzpuzpuz commented Jul 24, 2020

The only improvement I can think of from the implementation perspective is wrappers around http's EventEmitters (req/res), as they might emit events in a different context. Although, I'm not sure whether Adonis listens to those events and/or allows users to listen to them.

Here is a simple example: https://github.com/puzpuzpuz/cls-rtracer/blob/master/src/util.js#L8
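The idea, roughly, is to re-bind listeners added to req/res so they run in the async context that was active when the request started, similar to the linked helper (a sketch, not code from this PR):

const { AsyncResource } = require('async_hooks')

// Wraps emitter.on so that listeners registered later still execute in the
// context captured at wrap time (e.g. right after storage.run() is entered).
function bindEmitterToCurrentContext(emitter) {
  const resource = new AsyncResource('HTTP_CONTEXT')
  const originalOn = emitter.on.bind(emitter)
  emitter.on = (event, listener) =>
    originalOn(event, (...args) => resource.runInAsyncScope(listener, emitter, ...args))
  return emitter
}

// Usage (inside the request bootstrap): bindEmitterToCurrentContext(req)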

@puzpuzpuz

I've checked the benchmark and don't see any problems with it or any potential improvements. Indeed, when the HTTP server does almost nothing, ALS (or async_hooks, to be more precise) adds significant overhead. However, with a different benchmark that does something to emulate a real-world application workload, like this one, the overhead is much smaller. So my take is that if the framework documentation clearly warns users about the overhead (which should be measured individually), this feature could be added, as it's certainly valuable.

Also, I think the API is currently new, so we will see performance improvements in the future, right? Or is that something that will never happen because of the design of async local storage?

AFAIK all the low-hanging fruit in terms of ALS-targeted optimizations in async_hooks has already been picked, so I wouldn't expect any major performance improvements in the near future. If you're curious about those optimizations, here is a short list of related core PRs (it's certainly incomplete, but it should be enough to understand what's going on):

One of the main remaining problems related to the overhead (especially when async/await syntax is in use) is the integration of async_hooks with native Promises. Node.js uses the Isolate::SetPromiseHook hook exposed by V8. This hook is called on the native side of Node.js for every promise when an important life cycle event (created/before/after/resolved) happens. This means that even with the fast path introduced by 32891 we have a JS-to-native-code switch for each hook invocation. That seems to be the main cost associated with promise tracking in async_hooks. If the hook were exposed on the JS side, this could help, but that requires significant changes in V8 and Node.

What's written above is my view, and other collaborators (who are certainly more knowledgeable on the topic) may have a different opinion.

@thetutlage (Member)

@puzpuzpuz Thanks a ton for sharing all these insights, it's really helpful. I will start looking into getting this implemented and then try it on a real project to see the actual overhead.

@puzpuzpuz

@thetutlage no problem at all. I'll be waiting for your opinion once the experiments are done.

stale bot commented Sep 24, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the Status: Abandoned Dropped and not into consideration label Sep 24, 2020
@targos (Member, Author) commented Sep 24, 2020

I'm going to update this eventually.

@stale stale bot removed the Status: Abandoned Dropped and not into consideration label Sep 24, 2020
stale bot commented Nov 24, 2020

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the Status: Abandoned Dropped and not into consideration label Nov 24, 2020
@targos (Member, Author) commented Nov 24, 2020

This is not abandoned. @thetutlage is there a way to prevent the stale bot from marking this PR?

@stale stale bot removed the Status: Abandoned Dropped and not into consideration label Nov 24, 2020
@thetutlage (Member)

Updated config to exempt the Status: Blocked label

@thetutlage (Member)

I followed this thread nodejs/node#34493, and especially the comment by mcollina (nodejs/node#34493 (comment)) saying that micro benchmarks may not be the best way to measure the async hooks overhead.

Should we give it a try once again?

I am thinking of finalizing the feature and putting it behind a flag. I have a couple of apps handling real-world traffic; I will use it there and see the actual overhead.

@puzpuzpuz

Should we give it a try once again?

I don't think you'll see significant improvements in terms of the overhead, but once nodejs/node#36394 lands, the improvement should be very significant. So, in future versions of Node, the async_hooks + native promises integration (which was the main factor) is very likely to improve a lot.

stale bot commented Feb 16, 2021

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the Status: Abandoned Dropped and not into consideration label Feb 16, 2021
@targos (Member, Author) commented Feb 16, 2021

Updated config to exempt the Status: Blocked label

Seems that didn't work

@stale stale bot removed the Status: Abandoned Dropped and not into consideration label Feb 16, 2021
stale bot commented Apr 17, 2021

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the Status: Abandoned Dropped and not into consideration label Apr 17, 2021
@vdeturckheim

I know that there is some ongoing work to make this faster but I don't know exactly where that is. I will try to get some data next week.

@thetutlage (Member)

Thanks @vdeturckheim

@stale stale bot removed the Status: Abandoned Dropped and not into consideration label Apr 19, 2021
@fareeda0 commented Apr 20, 2021

FYI, just saw that the underlying V8 changes for nodejs/node#36394 were just merged in.

Thanks to all contributors for working on this!

@thetutlage I had a question about how the async HttpContext might work with v5.

Take Lucid, for example. Would the async context work with the emitter, i.e. would we have access to the request ctx when handling the db::query event?

This would allow for much deeper, correlated logging within our applications, and, at least for our business, it is the most promising use of this feature.
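For illustration, the kind of correlated logging being described could look roughly like this (the event name, payload shape, and logger call are assumptions for the sketch, not part of this PR):

import Event from '@ioc:Adonis/Core/Event'
import Logger from '@ioc:Adonis/Core/Logger'
import AsyncHttpContext from '@ioc:Adonis/Core/AsyncHttpContext'

// Hypothetical listener: tag every SQL query with the id of the request that
// triggered it, so logs can be correlated per request.
Event.on('db:query', (query) => {
  const ctx = AsyncHttpContext.getContext()
  Logger.info({
    requestId: ctx ? ctx.request.id() : 'outside-request',
    sql: query.sql,
  })
})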

@thetutlage (Member)

That's great. I see that the PR does improve the performance a lot.

Take Lucid, for example. Would the async context work with the emitter, i.e. would we have access to the request ctx when handling the db::query event?

Yes, it will. In fact, AdonisJS uses a profiler under the hood that collects more metrics. However, we never exposed or documented it, because the real benefit of the profiler only comes into the picture once async hooks are in use.

@targos (Member, Author) commented Apr 21, 2021

I'm going to update this PR soon.

@RomainLanz (Member) commented May 20, 2021

It seems that some performance-related updates landed in the latest Node.js release.

https://twitter.com/bengl/status/1395229386480234500

@fareeda0

It seems that some performance-related updates landed in the latest Node.js release.

https://twitter.com/bengl/status/1395229386480234500

Yep, just saw the release PR (#38719) for 16.2.0.

Really excited for this one! 😄

@RomainLanz RomainLanz added the Status: On Hold The work was started earlier, but is on hold now. Check comments for reasoning label May 20, 2021
@targos (Member, Author) commented May 28, 2021

I updated the code. Some questions before I make the PR ready and write tests:

Is the API OK, from a user's perspective?

It looks like this:

import AsyncHttpContext from '@ioc:Adonis/Core/AsyncHttpContext'
import Route from '@ioc:Adonis/Core/Route'

import { setTimeout } from 'timers/promises'

AsyncHttpContext.getContext() // Returns null (because it's called outside of a request)
AsyncHttpContext.getContextOrFail() // Throws (same reason)

Route.get('/test', async () => {
  await doSomething()
  return 'ok'
})

async function doSomething() {
  await setTimeout(1000)
  const ctx = AsyncHttpContext.getContextOrFail()
  console.log(ctx.request.url())
}

Should it be opt-in?

I will do some new testing with Node.js 16.2.0 as the overhead should be much smaller than before.
That said, v14.x currently doesn't contain the perf boost (it will probably be backported in the next minor LTS release in August). I can make it opt-in as a start, but as soon as Adonis core wants to use AsyncLocalStorage internally, the performance penalty will be paid anyway and there would be no reason to disable the API for the user.

@thetutlage (Member)

Thanks for updating the PR :)

So, there are a couple of ways to opt into it.

The first is to enable it using a flag; the internals of the framework would then rely on the flag to access the async context. This way, someone can opt into it just for tracing. It also opens up the possibility of having something like Laravel Debugbar for development. See the sketch below.

The other is to make this the default way to write our apps, meaning user-land code uses the async context for developing features. If the performance impact is not too big, I will opt for this, because it keeps everything simple: the guides and tutorials will all use the same syntax/API.
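For illustration, the flag-based option could look something like this in the application config (the key name and shape are hypothetical, not part of this PR):

// config/app.ts (hypothetical flag, for illustration only)
export const http = {
  // ...existing HTTP server settings
  useAsyncLocalStorage: true, // opt in to the async HTTP context
}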

@targos targos marked this pull request as ready for review May 28, 2021 13:45
@targos targos changed the title feat: add AsyncHttpContext (wip) feat: add async HTTP context May 28, 2021
@thetutlage (Member)

Let's get it in :)

@thetutlage thetutlage merged commit a7e6d0d into adonisjs:develop May 28, 2021
@thetutlage thetutlage removed the Status: On Hold The work was started earlier, but is on hold now. Check comments for reasoning label May 28, 2021
@targos targos deleted the als branch May 28, 2021 14:42