Does Langfuse tracing work in Vercel edge functions? #4389
Replies: 2 comments
-
Hi @holdenmatt! Could you please enable debug logs and share whether you see any logs / spans exported in your edge runtime? Also, could you please enable the exporter's debug mode by constructing it with new LangfuseExporter({ debug: true })?
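For reference, a minimal sketch of where that option goes in a Next.js instrumentation.ts, assuming the langfuse-vercel and @vercel/otel setup from the Vercel AI SDK guide (the service name is illustrative):

```ts
// instrumentation.ts (project root) -- registers OpenTelemetry with the
// Langfuse exporter; debug: true makes the exporter log what it exports.
import { registerOTel } from "@vercel/otel";
import { LangfuseExporter } from "langfuse-vercel";

export function register() {
  registerOTel({
    serviceName: "my-next-app", // illustrative name
    traceExporter: new LangfuseExporter({ debug: true }),
  });
}
```

With debug enabled, the exporter's output in the Vercel runtime logs should show whether any spans are being exported from the edge runtime at all.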
-
I'm running into the same issue. I've enabled debug logs in my LangfuseExporter and I can see […] in my logs, yet nothing appears in my dashboard. I've also tried using […]. The exact same setup works locally. EDIT: For me it's not working in the Node.js runtime on Vercel either :(
-
Has anyone gotten Langfuse working in Vercel "edge" functions?
I have a Next.js app hosted on Vercel that uses the Vercel AI SDK (the "ai" package). I followed the guide here to set up Langfuse tracing via OpenTelemetry: https://langfuse.com/docs/integrations/vercel-ai-sdk
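Roughly, my setup follows that guide; a sketch of the API route (assuming the App Router, with the route path, provider, and model being illustrative):

```ts
// app/api/chat/route.ts -- Vercel AI SDK call with telemetry enabled
// so its spans are picked up by the registered Langfuse exporter.
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const { text } = await generateText({
    model: openai("gpt-4o-mini"), // illustrative model
    prompt,
    experimental_telemetry: { isEnabled: true }, // required for tracing
  });

  return Response.json({ text });
}
```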
I see traces from my dev environment but none from my production app. After debugging for a while, I disabled the "edge" runtime in my API route, and production traces finally appeared.
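Concretely, the only change was the runtime setting on the route; a minimal sketch assuming the App Router route segment config (the rest of the route is unchanged):

```ts
// app/api/chat/route.ts -- Next.js route segment config.
// With "edge", production traces never reach Langfuse for me;
// switching to the Node.js runtime makes them appear.
// export const runtime = "edge";
export const runtime = "nodejs";
```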
If I use "edge", I don't see any errors in the Vercel build or runtime logs; the traces simply never appear in Langfuse.
Is this expected?