Assertion (wrap->ssl_) != nullptr failed in TLSWrap::GetServername #48000
Comments
I'm not aware of any recent change to
The increase in errors seems to only affect https://ci.nodejs.org/manage/computer/test-rackspace-fedora32-x64-1/builds
According to the build history, the first failure was https://ci.nodejs.org/job/node-test-commit-linux/nodes=fedora-last-latest-x64/52042/ Commit: 75b0d9e
@targos should we take it offline until this is resolved?
I just took it offline.
It's possible to reproduce on the machine using:
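For reference, a single outbound HTTPS request from Node appears to be enough to hit the failing TLS path on an affected build; this is only a minimal sketch, assuming nothing more elaborate than one request running into a slow or timing-out connection is needed (the URL is illustrative):

```js
// Minimal sketch: one HTTPS request. Assumption: on an affected build, a slow
// or timing-out connection attempt is enough to reach the asserting
// TLSWrap::GetServername path. The URL is illustrative.
const https = require('node:https');

https.get('https://raw.githubusercontent.com/nodejs/node/main/CHANGELOG.md', (res) => {
  res.resume();
  res.on('end', () => console.log('status:', res.statusCode));
}).on('error', (err) => console.error(err));
```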
@tniessen Do you have an idea on how we can debug it? Do you want access to the machine?
Reverting 2d24b29 fixes the error.
/cc @ShogunPanda
After reverting 2d24b29, the request at https://github.com/nodejs/node/blob/8b3777d0c82c01229e724d84586fdc472fd4deda/tools/doc/versions.mjs#LL44C25-L44C31 fails with
@targos Can I have access to the machine so I can debug it?
@ShogunPanda Sure. Can you open an access request on
@targos Done: nodejs/build#3354
If it helps, I've been running into this in my CI and can repro in an ubuntu:20.04 Docker container on my macOS laptop:
Something about the large set of devDependencies in the package-lock.json file results in
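An npm install over a lock file like that fans out many concurrent HTTPS requests to the registry, so a rough stand-in in plain Node is a batch of parallel requests; this is only a sketch, assuming the crash depends on parallel TLS connections rather than anything npm-specific (registry URL and request count are illustrative):

```js
// Rough stand-in for the npm install repro: many concurrent HTTPS requests.
// Assumption: parallel TLS connections, not npm itself, are what matters.
// The registry URL and the request count are illustrative.
const https = require('node:https');

const url = 'https://registry.npmjs.org/node';
for (let i = 0; i < 100; i++) {
  https.get(url, (res) => res.resume())
    .on('error', (err) => console.error(`request ${i}:`, err.code ?? err));
}
```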
@trentm yup, that is helpful. I hope to have an answer soon.
This is also happening in GitHub Actions; see, for example, this.
See this for a test case that should reliably reproduce the crash on v21, or ERR_SOCKET_CONNECTION_TIMEOUT on v20.
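A sketch in the same spirit as such a test case, assuming 2d24b29 is the autoSelectFamily change and that the trigger is its per-attempt connection timeout; the host and the aggressive timeout value are illustrative and only there to force that path:

```js
// Sketch: force the autoSelectFamily attempt timeout during an HTTPS request.
// On an affected v21 build this can trip the TLSWrap assertion; on v20 it
// tends to surface as ERR_SOCKET_CONNECTION_TIMEOUT instead.
const https = require('node:https');

https.get('https://nodejs.org/', {
  autoSelectFamily: true,
  autoSelectFamilyAttemptTimeout: 10, // illustrative: small enough to time out
}, (res) => {
  res.resume();
  res.on('end', () => console.log('completed with status', res.statusCode));
}).on('error', (err) => console.error(err.code ?? err));
```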
Will wait for @silverwind to verify that the latest main fixes this, and then I'll close.
Will verify tomorrow once a nightly build is available.
Results from
See this for the stack trace, etc.
Version
HEAD
Platform
fedora-last-latest-x64
Subsystem
tls
What steps will reproduce the bug?
Start a Jenkins CI job that includes fedora-last-latest-x64.
How often does it reproduce? Is there a required condition?
It happens quite often. The build time graph shows a clear increase in error rates during the last few days:
What is the expected behavior? Why is that the expected behavior?
No error.
What do you see instead?
Additional information
This only started happening recently. Unless there was an infrastructure change (cc @nodejs/build), it is likely due to a recent change on main.