
Sending sentry data through a serverless function #4704

Closed
tianhuil opened this issue Mar 11, 2022 · 5 comments
Labels
Package: nextjs (Issues related to the Sentry Nextjs SDK) · Type: Improvement

Comments

@tianhuil

Problem Statement

A popular strategy for server-side logging is to:

A. Compute the normal request
B. Return the result to the client
C. On the same process, take the telemetry information collected in (A) and send it to the APM (e.g. Sentry).

Because (C) requires an async call to another (possibly distant) server, it can be slow, perhaps even slower than (A). Also, (C) must start after (A) completes, as (C) describes the performance of (A).

Ideally, we would execute (B) before beginning (C). This is possible in Express because res.send is just an ordinary function call that terminates the communication stream with the client but not the Express handler (Stack Overflow). However, it cannot be accomplished on Vercel, which does not support these "fire-and-forget" background tasks:

Vercel does not support streaming responses from Serverless Functions due to an upstream limitation from AWS.
This means that background tasks, also known as "fire-and-forget" is not supported. Once the Serverless Function returns the response payload, it stops processing including any pending background tasks.

Therefore, in a Vercel + Sentry stack, (B) cannot happen until after (C), creating a significant and unnecessary performance penalty. Indeed, the Sentry Next.js package has to monkey-patch (wrap) Next.js's .end to wait until the Sentry telemetry has been sent before returning to the client.
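The Express-style flow described above can be sketched as follows. This is a minimal illustration, not Sentry's actual instrumentation; `sendTelemetry` is a hypothetical stand-in for whatever call ships the collected data to the APM:

```javascript
// Sketch of the (A) → (B) → (C) pattern in an Express-style handler.
// `sendTelemetry` is hypothetical; it stands in for the APM transport call.
function makeHandler(sendTelemetry) {
  return function handler(req, res) {
    const started = Date.now();
    const result = { ok: true };        // (A) compute the normal request
    res.send(result);                   // (B) respond; the handler keeps running
    sendTelemetry({                     // (C) fire-and-forget: the client is
      durationMs: Date.now() - started, //     not kept waiting on this call
    });
  };
}
```

On a platform that kills the process as soon as the response payload is returned, step (C) never runs, which is exactly the Vercel limitation quoted above.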

Solution Brainstorm

Proxy Function:

I propose that @sentry/nextjs be architected in four steps:

A. Compute the normal request
B. Return the result to the client
C. On the same process, take the telemetry information collected in (A) and send it to a Next.js serverless function (the proxy function)
D. The proxy function sends the data on to the APM (e.g. Sentry).

While (B) must still occur after (C), (C) is now likely much faster: the data is sent to a function on the same machine, or at least in the same data center. The slow step of crossing to a potentially distant server, (D), no longer blocks (B), improving performance. The proxy function itself can be very simple (e.g. it takes the body and headers and forwards them to Sentry).
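A sketch of what such a proxy route might look like, under stated assumptions: the route path, the `SENTRY_DSN` environment variable, and the handler shape are all hypothetical, and the envelope endpoint is derived from Sentry's documented DSN/ingestion URL format:

```javascript
// Hypothetical Next.js API route, e.g. pages/api/sentry-proxy.js.
// Derive Sentry's envelope ingestion URL from a DSN of the form
// https://<publicKey>@<host>/<projectId>
function envelopeUrlFromDsn(dsn) {
  const { host, pathname } = new URL(dsn);
  const projectId = pathname.replace(/^\//, "");
  return `https://${host}/api/${projectId}/envelope/`;
}

// The route just forwards the body to Sentry and acknowledges the caller.
async function sentryProxyHandler(req, res) {
  const url = envelopeUrlFromDsn(process.env.SENTRY_DSN);
  await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/x-sentry-envelope" },
    body: req.body,
  });
  res.status(200).end();
}
```

The SDK on the original function would then point its transport at this route instead of at Sentry directly, so its step (C) stays inside the data center.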

This solution may be useful.

Send and don't wait for acknowledgement

In step (C) (either with or without the proxy function), use net.Socket to send the data without waiting for Sentry to acknowledge the result (see this). Instead of completing the full HTTP exchange, just send the headers and body and stop listening for a reply. For many analytics applications, "best-effort" delivery may be preferable if it comes with a speed boost.

@AbhiPrasad
Member

Hey thanks for writing in!

The idea of a proxy is really interesting, and it could perhaps come in the form of a custom transport. Unfortunately we can't use the "send and don't wait for acknowledgement" option here, because we need the response from Sentry for rate-limiting reasons (we need to watch for 429s).

I'm going to backlog this for now, with the note that it's blocked by #4660, but PRs are welcome for anyone who wants to experiment with this!

As an additional note: we are currently working on introducing an AWS Lambda extensions API for Sentry, which will provide the same off-process advantages as the proxy serverless function. We are also working with Vercel so that this will be available on the Vercel serverless platform. We'll provide more updates when that is closer to release!

@AbhiPrasad added the Status: Backlog and Package: nextjs labels and removed the Status: Untriaged label on Mar 14, 2022
@tianhuil
Author

Thank you @AbhiPrasad. It sounds like AWS Lambda extensions API solves this problem "the right way". I'd definitely love to have this if we can!

@mikestopcontinues

Isn't the waitUntil() API sufficient to solve this?
https://nextjs.org/docs/api-reference/next/server#nextfetchevent
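A sketch of the semantics behind that suggestion, using a fake `event` object (in Next.js middleware the real `event` is a NextFetchEvent; `sendTelemetry` is hypothetical): `waitUntil(promise)` lets the response return immediately while the runtime keeps the function alive until the registered promise settles.

```javascript
// Sketch of waitUntil semantics with a hand-rolled event object.
// `sendTelemetry` is a hypothetical stand-in for the APM flush call.
function middleware(req, event, sendTelemetry) {
  event.waitUntil(sendTelemetry(req)); // background work, not awaited
  return { status: 200 };              // response is returned right away
}
```

If the platform honors `waitUntil` for the runtime in question, this would give exactly the fire-and-forget step (C) the issue asks for, without a proxy.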

@AbhiPrasad
Member

Possibly - not sure though. @lobsterkatie any context you have here?

@github-actions
Contributor

This issue has gone three weeks without activity. In another week, I will close it.

But! If you comment or otherwise update it, I will reset the clock, and if you label it Status: Backlog or Status: In Progress, I will leave it alone ... forever!


"A weed is but an unloved flower." ― Ella Wheeler Wilcox 🥀

github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) on Feb 24, 2023