
🐛 BUG: "warning: Not symbolizing stack traces because $LLVM_SYMBOLIZER is not set" #3631

Closed
aaronadamsCA opened this issue Jul 19, 2023 · 20 comments · Fixed by #4341

@aaronadamsCA

aaronadamsCA commented Jul 19, 2023

Which Cloudflare product(s) does this pertain to?

Wrangler core

What version(s) of the tool(s) are you using?

3.2.0

What version of Node are you using?

No response

What operating system are you using?

Ubuntu 22.04.2 LTS

Describe the Bug

Any time the application reports an error, the first error is preceded by this warning:

workerd/util/symbolizer.c++:99: warning: Not symbolizing stack traces because $LLVM_SYMBOLIZER is not set. To symbolize stack traces, set $LLVM_SYMBOLIZER to the location of the llvm-symbolizer binary. When running tests under bazel, use `--test_env=LLVM_SYMBOLIZER=<path>`.

Stack traces that follow are squished and hard to read.

This has also been reported in comments at the bottom of #3262 and #3457, and appears to affect multiple operating systems.

Please provide a link to a minimal reproduction

No response

Please provide any relevant error logs

No response

@aaronadamsCA aaronadamsCA added the bug Something that isn't working label Jul 19, 2023
@github-project-automation github-project-automation bot moved this to Untriaged in workers-sdk Jul 19, 2023
@juanpmarin

Also happening on Fedora Linux 38.

@gooftroop

Also happening on macOS Ventura.

@dagnelies

dagnelies commented Jul 21, 2023

Here is how it was triggered for me:

1. Let it run using npx wrangler pages dev ...
2. Open the browser and invoke a URL/function to trigger an uncaught exception in the code (a minimal example is sketched at the end of this comment).
3. The terminal will show the error/exception and prompt for user input again (instead of tailing the output as usual).

As a result, when restarting the server, you get:

Compiling worker to "/tmp/functionsWorker-0.27552777603724454.mjs"...
✨ Compiled Worker successfully
 ⛅️ wrangler 3.3.0
------------------
wrangler dev now uses local mode by default, powered by 🔥 Miniflare and 👷 workerd.
To run an edge preview session for your Worker, use wrangler dev --remote
⎔ Starting local server...
[mf:wrn] The latest compatibility date supported by the installed Cloudflare Workers Runtime is "2023-07-17",
but you've requested "2023-07-21". Falling back to "2023-07-17"...
workerd/util/symbolizer.c++:99: warning: Not symbolizing stack traces because $LLVM_SYMBOLIZER is not set. To symbolize stack traces, set $LLVM_SYMBOLIZER to the location of the llvm-symbolizer binary. When running tests under bazel, use `--test_env=LLVM_SYMBOLIZER=<path>`.
*** Fatal uncaught kj::Exception: kj/async-io-unix.c++:945: failed: ::bind(sockfd, &addr.generic, addrlen): Address already in use; toString() = 0.0.0.0:8788
stack: /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@31cdfba /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@31cdd5f /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@31cbfc5 /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@168a085 /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@168aa00 /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@164afdd /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@164fd47 /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@164fac4 /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@164faac /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@32061de /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@3205ddd /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@32040c8 /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@3203e8a /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@1640293 /lib/x86_64-linux-gnu/libc.so.6@24082 /workspaces/app/node_modules/@cloudflare/workerd-linux-64/bin/workerd@164002d
✘ [ERROR] MiniflareCoreError [ERR_RUNTIME_FAILURE]: The Workers runtime failed to start. There is likely additional logging output above.

...because the address is still in use, but you're back at the terminal prompt instead of tailing output.

On the browser side, you get: Error: Network connection lost.
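
For reference, a minimal Pages Function like the following (a hypothetical sketch; the file name and error message are placeholders) is enough to trigger the uncaught exception in step 2:

// functions/boom.ts (hypothetical example)
export const onRequest: PagesFunction = () => {
  // Any uncaught error thrown from a Pages Function surfaces the symbolizer warning in the wrangler terminal
  throw new Error("boom");
};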

@sksat

This comment was marked as duplicate.

@dagnelies

Any update? By the way, it's easy to reproduce on GitHub Codespaces: just make a wonky function that throws an exception.

@mrbbot Hi. Maybe it's related to the static build from #3262?

@huw
Contributor

huw commented Aug 2, 2023

Installing llvm-symbolizer and supplying LLVM_SYMBOLIZER=$(which llvm-symbolizer) to the environment fixes the initial issue (it should be present on many installations, including GitHub Codespaces; otherwise this will depend on your distro). This is what the error suggests to do—did nobody here try this?
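
For example (assuming a Pages project served from ./site, with llvm-symbolizer already on the PATH), a command along these lines gets past the initial warning:

LLVM_SYMBOLIZER=$(which llvm-symbolizer) npx wrangler pages dev ./site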

However, because the symbolizer’s arguments are version-dependent, this just raises a new issue:

llvm-symbolizer: Unknown command line argument '--relativenames'.  Try: '/usr/bin/llvm-symbolizer --help'

It seems like workerd has been compiled against a specific llvm-symbolizer version that isn't bundled with it. I don't think it should be bundled, but it would be nice to clean up the error output a bit, because that's followed by a tonne of gibberish, especially if the error is just Network connection lost.

@kael

kael commented Aug 2, 2023

it would be nice to clean up the error output a bit, because that’s followed by a tonne of gibberish, especially if the error is just Network connection lost.

+1

Yes please, logs are a bit hard to read with this error message.

@ghost

ghost commented Aug 8, 2023

I used the example code from CF at https://developers.cloudflare.com/r2/examples/aws/aws-sdk-js-v3/

As long as I use Cloudflare, it works fine.
As soon as I replace the endpoint with AWS S3 or Backblaze B2, this bug appears.
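
For anyone trying to reproduce: the setup is roughly the following (a hypothetical sketch based on the linked docs; the endpoint, bucket name, and credentials are placeholders). The warning only shows up once the endpoint points at a non-R2 provider:

import { S3Client, ListObjectsV2Command } from "@aws-sdk/client-s3";

// Swapping the R2 endpoint for an AWS S3 or Backblaze B2 endpoint is when the bug appears
const s3 = new S3Client({
  region: "auto",
  endpoint: "https://s3.us-west-000.backblazeb2.com", // placeholder endpoint
  credentials: {
    accessKeyId: "ACCESS_KEY_ID",         // placeholder
    secretAccessKey: "SECRET_ACCESS_KEY", // placeholder
  },
});

export default {
  async fetch(): Promise<Response> {
    const objects = await s3.send(new ListObjectsV2Command({ Bucket: "my-bucket" }));
    return Response.json(objects.Contents ?? []);
  },
};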

@Cherry
Contributor

Cherry commented Aug 13, 2023

People keep hitting this in Discord, with errors like Address already in use; toString() = 0.0.0.0:8788.

After killing any other workerd processes, things begin to work again for them. Can anything be done to improve the DX here?
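
One blunt way to do that, assuming no other workerd instance you care about is running:

pkill workerd

Alternatively, target only the port wrangler reports as busy with the lsof one-liner shared further down the thread.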

@niconiahi

niconiahi commented Aug 20, 2023

In my case, I'm getting the same error just by starting the server. I started my repo with create-remix using the Cloudflare Pages starter just a few days ago.

I'm also getting this error:

workerd/jsg/jsg.c++:136: error: took recursive isolate lock; kj::getStackTrace() = 10282cd9f 102ab5947 102ac8cf3 102ac8c77 1040bd7bb 102e2ced7 102e2dbf3 102d93eb3 102d940db 104090a3f 102ba9a33 104090d67 102baa51f 104090d67 102a957cf 1040926eb 104090d67 102a963c3 104091a13 104090a3f 102a96af7 104090d67 102a96c23 104091a13 104090a3f 102

And this is the error already discussed above, which I'm also getting:

workerd/util/symbolizer.c++:99: warning: Not symbolizing stack traces because $LLVM_SYMBOLIZER is not set. To symbolize stack traces, set $LLVM_SYMBOLIZER to the location of the llvm-symbolizer binary. When running tests under bazel, use `--test_env=LLVM_SYMBOLIZER=<path>`.
(screenshot attached: 2023-08-20 13:58)

@richardscarrott
Contributor

@niconiahi did you ever find a solution to this error? It's super frustrating not having source maps when developing locally. (We are also building a Remix app using the CF Pages starter.)

@dagnelies

dagnelies commented Sep 29, 2023

Yup... super annoying... and no comment from the devs 👀. I don't know if you realize this, but this crashes the server extremely frequently for the majority (?) of people.

The command I regularly use to kill the stuck server: kill $(lsof -t -i:8788)

@aaronadamsCA
Author

It seems the root cause is this:

cloudflare/workerd#706 (comment)

Yeah our logging in workerd is awkward now. In production, we convert these log lines into Sentry errors, and we tend to do a lot of logging that's just meant to be informative for us as the developers of workerd. For someone using workerd in their own infrastructure, the logging is not really appropriate. We need to figure out what to do about this but it's probably a big project...

I can confirm that when I configure $LLVM_SYMBOLIZER, it only adds meaningless C++ traces to workerd error output, making it even less useful:

-workerd/util/symbolizer.c++:98: warning: Not symbolizing stack traces because $LLVM_SYMBOLIZER is not set. To symbolize stack traces, set $LLVM_SYMBOLIZER to the location of the llvm-symbolizer binary. When running tests under bazel, use `--test_env=LLVM_SYMBOLIZER=<path>`.
 workerd/jsg/jsg.c++:136: error: took recursive isolate lock; kj::getStackTrace() = /workspace/node_modules/.pnpm/@cloudflare+workerd-linux-64@1.20230922.0/node_modules/@cloudflare/workerd-linux-64/bin/workerd@...
+
+
+
+
+
+
+kj::_::HeapDisposer<workerd::Worker::Isolate::ResponseStreamWrapper>::disposeImpl(void*) const at ??:0:0
+
+
+kj::OneOf<kj::Own<kj::AsyncOutputStream, std::nullptr_t>, kj::Own<kj::GzipAsyncOutputStream, std::nullptr_t>, kj::Own<kj::BrotliAsyncOutputStream, std::nullptr_t>, workerd::api::(anonymous namespace)::EncodedAsyncOutputStream::Ended>::destroy() at ??:0:0
+
+
+kj::_::HeapDisposer<workerd::api::(anonymous namespace)::EncodedAsyncOutputStream>::disposeImpl(void*) const at ??:0:0
+
+workerd::api::ReadableStreamInternalController::pumpTo(workerd::jsg::Lock&, kj::Own<workerd::api::WritableStreamSink, std::nullptr_t>, bool)::Holder::~Holder() at ??:0:0
+kj::_::AttachmentPromiseNode<kj::Own<workerd::api::ReadableStreamInternalController::pumpTo(workerd::jsg::Lock&, kj::Own<workerd::api::WritableStreamSink, std::nullptr_t>, bool)::Holder, std::nullptr_t> >::destroy() at ??:0:0
+
+
+kj::_::AttachmentPromiseNodeBase::dropDependency() at ??:0:0
+kj::_::AttachmentPromiseNode<kj::_::Deferred<workerd::api::ServiceWorkerGlobalScope::request(kj::HttpMethod, kj::StringPtr, kj::HttpHeaders const&, kj::AsyncInputStream&, kj::HttpService::Response&, kj::Maybe<kj::StringPtr>, workerd::Worker::Lock&, kj::Maybe<workerd::api::ExportedHandler&>)::$_7> >::destroy() at ??:0:0
+
+kj::_::TransformPromiseNodeBase::dropDependency() at ??:0:0
+
+
+kj::_::TransformPromiseNode<workerd::api::DeferredProxy<void>, workerd::api::DeferredProxy<void>, workerd::api::ServiceWorkerGlobalScope::request(kj::HttpMethod, kj::StringPtr, kj::HttpHeaders const&, kj::AsyncInputStream&, kj::HttpService::Response&, kj::Maybe<kj::StringPtr>, workerd::Worker::Lock&, kj::Maybe<workerd::api::ExportedHandler&>)::$_8, workerd::api::ServiceWorkerGlobalScope::request(kj::HttpMethod, kj::StringPtr, kj::HttpHeaders const&, kj::AsyncInputStream&, kj::HttpService::Response&, kj::Maybe<kj::StringPtr>, workerd::Worker::Lock&, kj::Maybe<workerd::api::ExportedHandler&>)::$_9>::destroy() at ??:0:0
+
+kj::_::TransformPromiseNodeBase::dropDependency() at ??:0:0
+
+kj::_::TransformPromiseNode<kj::_::Void, workerd::api::DeferredProxy<void>, workerd::WorkerEntrypoint::request(kj::HttpMethod, kj::StringPtr, kj::HttpHeaders const&, kj::AsyncInputStream&, kj::HttpService::Response&)::$_6, kj::_::PropagateException>::destroy() at ??:0:0
+
+kj::_::ExclusiveJoinPromiseNode::Branch::~Branch() at ??:0:0
+kj::_::ExclusiveJoinPromiseNode::~ExclusiveJoinPromiseNode() at ??:0:0
+
+kj::_::TransformPromiseNodeBase::dropDependency() at ??:0:0
+
+
+
+
+kj::_::ChainPromiseNode::~ChainPromiseNode() at ??:0:0
+
+kj::_::AttachmentPromiseNodeBase::dropDependency() at ??:0:0
+
+kj::_::AttachmentPromiseNode<kj::_::Deferred<workerd::WorkerEntrypoint::request(kj::HttpMethod, kj::StringPtr, kj::HttpHeaders const&, kj::AsyncInputStream&, kj::HttpService::Response&)::$_8> >::destroy() at ??:0:0
+
+kj::_::TransformPromiseNodeBase::dropDependency() at ??:0:0
+
+kj::_::TransformPromiseNode<kj::Promise<void>, kj::_::Void, workerd::WorkerEntrypoint::request(kj::HttpMethod, kj::StringPtr, kj::HttpHeaders const&, kj::AsyncInputStream&, kj::HttpService::Response&)::$_9, kj::_::PropagateException>::destroy() at ??:0:0
+
+
+
+kj::_::AttachmentPromiseNodeBase::dropDependency() at ??:0:0
+
+kj::_::AttachmentPromiseNode<kj::_::Deferred<workerd::WorkerEntrypoint::request(kj::HttpMethod, kj::StringPtr, kj::HttpHeaders const&, kj::AsyncInputStream&, kj::HttpService::Response&)::$_10> >::destroy() at ??:0:0
+
+
+kj::_::TransformPromiseNode<kj::Promise<void>, kj::_::Void, kj::_::IdentityFunc<kj::Promise<void> >, workerd::WorkerEntrypoint::request(kj::HttpMethod, kj::StringPtr, kj::HttpHeaders const&, kj::AsyncInputStream&, kj::HttpService::Response&)::$_11>::destroy() at ??:0:0
+
+
+
 Error: Testing errors
     at loader12 (file:///tmp/tmp-xxx-xxx/xxx.js:53093:9)
     at async callRouteLoaderRR (file:///tmp/tmp-xxx-xxx/xxx.js:3419:16)
     at async callLoaderOrAction (file:///tmp/tmp-xxx-xxx/xxx.js:2532:16)
     at async Promise.all (index 2)
     at async loadRouteData (file:///tmp/tmp-xxx-xxx/xxx.js:2236:19)
     at async queryImpl (file:///tmp/tmp-xxx-xxx/xxx.js:2113:20)
     at async Object.query (file:///tmp/tmp-xxx-xxx/xxx.js:2065:18)
     at async handleDocumentRequestRR (file:///tmp/tmp-xxx-xxx/xxx.js:3574:15)
     at async file:///tmp/tmp-xxx-xxx/xxx.js:3700:294
     at async handleFetch (file:///tmp/tmp-xxx-xxx/xxx.js:32751:36) {
   stack: Error: Testing errors
     at loader12 (file:///tmp…p/tmp-xxx-xxx/xxx.js:32751:36),
   message: Testing errors
-}
 [mf:inf] GET / 500 Internal Server Error (323ms)
 [mf:inf] GET /favicon.ico 200 OK (9ms)

Setting LLVM_SYMBOLIZER=/bin/true suppresses the warning from symbolizer.c++, but not the (longer) error from jsg.c++.
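
In other words, an invocation along these lines (the command and path are placeholders; adjust for your project) silences only that first warning line:

LLVM_SYMBOLIZER=/bin/true npx wrangler pages dev ./site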

@ngdangtu-vn

ngdangtu-vn commented Oct 9, 2023

I got this problem when testing a simple static site (which has nothing to do with Workers):

wrangler pages dev ./site
[mf:wrn] The latest compatibility date supported by the installed Cloudflare Workers Runtime is "2023-10-02",
but you've requested "2023-10-09". Falling back to "2023-10-02"...
workerd/util/symbolizer.c++:98: warning: Not symbolizing stack traces because $LLVM_SYMBOLIZER is not set. To symbolize stack traces, set $LLVM_SYMBOLIZER to the location of the llvm-symbolizer binary. When running tests under bazel, use `--test_env=LLVM_SYMBOLIZER=<path>`.
*** Fatal uncaught kj::Exception: kj/async-io-unix.c++:945: failed: ::bind(sockfd, &addr.generic, addrlen): Address already in use; toString() = 0.0.0.0:8788
stack: /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@39acfda /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@39acd7f /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@39aafe5 /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@18a3bf6 /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@18a4289 /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@18a49a4 /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@1854ecd /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@18592a7 /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@1859024 /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@185900c /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@39e4d2e /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@39e492d /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@39e2c68 /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@39e2a2a /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@184a299 /lib/x86_64-linux-gnu/libc.so.6@29d8f /lib/x86_64-linux-gnu/libc.so.6@29e3f /home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/node_modules/@cloudflare/workerd-linux-64/bin/workerd@184a02d
✘ [ERROR] MiniflareCoreError [ERR_RUNTIME_FAILURE]: The Workers runtime failed to start. There is likely additional logging output above.

After I restart the whole PC, it works. Anyway, I got the error above from running it with bunx; I think I triggered it by pressing d to open the devtools in wrangler. Here is what the devtools printed out:

/home/user/.local/bin/volta/tools/image/packages/wrangler/lib/node_modules/wrangler/wrangler-dist/cli.js:29374
            throw a;
            ^

Error: spawn google-chrome ENOENT
    at ChildProcess._handle.onexit (node:internal/child_process:284:19)
    at onErrorNT (node:internal/child_process:477:16)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21)
Emitted 'error' event on ChildProcess instance at:
    at ChildProcess._handle.onexit (node:internal/child_process:290:12)
    at onErrorNT (node:internal/child_process:477:16)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
  errno: -2,
  code: 'ENOENT',
  syscall: 'spawn google-chrome',
  path: 'google-chrome',
  spawnargs: [
    'https://devtools.devprod.cloudflare.dev/js_app?theme=systemPreferred&ws=localhost%3A9229%2Fws&debugger=true'
  ]
}

@admah admah moved this from Untriaged to Backlog in workers-sdk Oct 16, 2023
@admah
Contributor

admah commented Oct 16, 2023

Sorry for the delay in getting back to this issue. We are taking a look at what the root cause could be.

@jorgemartins-uon

Happening all the time here as well; we can't use it anymore, unfortunately. Any workaround in the meantime?

@kevinmarrec

So far I've only hit this error when dealing with Durable Objects and their WebSockets API.
It sounds like it's triggered by socket closing; related: cloudflare/workerd#1299

There are probably many scenarios that could end up here:
https://github.com/cloudflare/workerd/blob/main/src/workerd/util/symbolizer.c%2B%2B#L93

Hope we find a solution for this 🙏, or at least something we can do to suppress these error outputs (which sound like false-positive errors that only occur in development?).

@admah
Contributor

admah commented Oct 27, 2023

To update this issue: these warnings are coming from the workerd runtime in dev mode. We're working to suppress them since they are not actionable by users.

@pencilcheck

pencilcheck commented Oct 31, 2023

I read the previous comment, but it would be nice to know what temporary workarounds we can apply for now while waiting for an update. I assume once the fix is pushed we just need to update the wrangler npm package, right?

@erhathaway

For me, this issue was accompanied by another issue: cloudflare/workerd#1401 (comment)

When I fixed that issue, this one was fixed as well...

In wrangler.toml:

[dev]
ip = "127.0.0.1"
