Sending sentry data through a serverless function #4704
Comments
Hey thanks for writing in! The idea of a proxy is really interesting, and it can maybe come in the form of a custom transport. We unfortunately can't use the

I'm going to backlog this for now, with the note that it's blocked by #4660, but PRs are welcome for anyone who wants to experiment with this!

As an additional note: we are currently working on introducing an AWS Lambda extensions API for Sentry, which will provide these same off-process advantages as the proxy serverless function. We are also working with Vercel so that this will be available on the Vercel serverless platform. We'll provide more updates when that is closer to release!
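For readers experimenting with the custom-transport idea mentioned above: the JavaScript SDK's `tunnel` option redirects event envelopes to a URL of your own, which is one way to stand up such a proxy. The endpoint path below is made up, and this is a configuration sketch rather than the SDK's built-in serverless behavior.

```javascript
// Sketch: point the SDK at a same-origin proxy endpoint via the `tunnel`
// option. "/api/sentry-proxy" is a hypothetical route you would implement
// yourself; envelopes are POSTed there instead of directly to Sentry.
const Sentry = require("@sentry/nextjs");

Sentry.init({
  dsn: process.env.SENTRY_DSN,
  tunnel: "/api/sentry-proxy",
});
```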
Thank you @AbhiPrasad. It sounds like the AWS Lambda extensions API solves this problem "the right way". I'd definitely love to have this if we can!
Isn't the
Possibly - not sure though. @lobsterkatie any context you have here?
This issue has gone three weeks without activity. In another week, I will close it. But! If you comment or otherwise update it, I will reset the clock, and if you label it

"A weed is but an unloved flower." ― Ella Wheeler Wilcox 🥀
Problem Statement
A popular strategy for server-side logging is to:
A. Compute the normal request
B. Return the result to the client
C. On the same process, take the telemetry information collected in (A) and send it to the APM (e.g. Sentry).
Because (C) requires an async call to another (possibly distant) server, it can be slow, perhaps even slower than (A). Also, (C) can only start after (A) completes, since (C) describes the performance of (A).
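This cost can be sketched with made-up timings and helper names: the response in (B) cannot go out until the telemetry call in (C) resolves, so the client's wait is roughly the sum of both steps.

```javascript
// Sketch of steps (A)-(C) with hypothetical helpers: the telemetry flush (C)
// runs before the response (B) can be returned, so its latency is added
// to the client's wait time.
const sleep = (ms) => new Promise((r) => setTimeout(r, ms));

async function computeRequest() {        // (A) the real work
  await sleep(50);
  return { status: 200, body: "ok" };
}

async function sendTelemetry() {         // (C) e.g. flush spans to an APM backend
  await sleep(100);                      // often slower than (A) itself
}

async function handler() {
  const start = Date.now();
  const result = await computeRequest(); // (A)
  await sendTelemetry();                 // (C) blocks (B) on Vercel
  return { ...result, elapsed: Date.now() - start }; // (B)
}

handler().then(({ elapsed }) => {
  console.log(elapsed >= 150);           // client waits for (A) + (C)
});
```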
Ideally, we would execute (B) before beginning (C). This can be done in Express because `res.send` is just an ordinary command that terminates the communication stream with the client but not the Express handler (Stack Overflow). However, this cannot be accomplished in Vercel, because it does not support these "fire-and-forget" background tasks.

Therefore, in a Vercel + Sentry stack, (B) cannot happen until after (C), creating a significant and unnecessary performance penalty. Indeed, the Sentry Next.js package has to monkeypatch-wrap Next.js's `.end` to wait until the Sentry analytics have returned before responding to the client.

Solution Brainstorm
Proxy Function:
I propose that sentry/nextjs be architected as four steps:
A. Compute the normal request
B. Return the result to the client
C. On the same process, take the telemetry information collected in (A) and send it to a next serverless function (proxy function)
D. The proxy serverless function sends the data to the APM (e.g. Sentry).
While (B) must still occur after (C), (C) is now likely much faster. Indeed, the data will likely be sent to a function on the same machine, and certainly in the same data center. The slow step of crossing to a potentially distant server (D) no longer blocks (B), thus improving performance. The new proxy function can be very simple (e.g. it takes the body and headers and forwards them to Sentry).
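A minimal sketch of such a proxy function, written as a hypothetical Next.js-style API route (the names and envelope parsing are illustrative, not the actual @sentry/nextjs implementation): it reads the Sentry DSN out of the envelope header it received in (C) and forwards the payload upstream in (D).

```javascript
// Hypothetical "pages/api/sentry-proxy.js": receives a Sentry envelope from
// step (C) and forwards it to Sentry's ingest endpoint, step (D).

// The envelope's first line is a JSON header; when tunneling, it carries the
// DSN, which tells the proxy which project endpoint to forward to.
function ingestUrlFromEnvelope(envelope) {
  const header = JSON.parse(envelope.split("\n")[0]);
  const dsn = new URL(header.dsn);        // e.g. https://key@o123.ingest.sentry.io/456
  const projectId = dsn.pathname.replace("/", "");
  return `https://${dsn.host}/api/${projectId}/envelope/`;
}

// Assumes Node 18+ for the global `fetch`; `req`/`res` follow the
// Next.js API-route shape.
async function handler(req, res) {
  const envelope = req.body;              // raw envelope string from the client
  await fetch(ingestUrlFromEnvelope(envelope), {
    method: "POST",
    body: envelope,
  });
  res.status(200).json({ forwarded: true });
}

module.exports = { handler, ingestUrlFromEnvelope };
```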
This solution may be useful.
Send and don't wait for acknowledgement
In step (C) (either with or without the proxy function), use `net.Socket` to send the data without waiting for Sentry to acknowledge the result (see this). Instead of completing the full `http` protocol, just send the headers and body and stop listening for a reply. For many analytics applications, "best-effort" may be preferable if there is an accompanying speed boost.