Flaky optimism p2p::can_sync test #8166
Comments
We can simply remove the error log here.
The test is timing out though, so we probably need to do something else to fix the test.
This is because you can't have more than one thread pool per process; I ran into this with one of the benchmarks we hadn't kept up to date, too. Not entirely sure what the best way to circumvent this is.

Edit: Actually this is a bit different, but might be a similar error. I ran into this with Tokio (the benchmark tool spawned a Tokio runtime, after which the function we called did the same).
Actually, the flake might be more related to
But I still need to repro.
Yeah, reading the ThreadPoolBuilder docs, build_global returns an error but the docs don't say anything about it being fatal or stalling, so it should be OK: https://docs.rs/rayon/latest/rayon/struct.ThreadPoolBuilder.html#method.build_global
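To illustrate the point above: rayon's global thread pool can only be installed once per process, and a second attempt returns an error rather than panicking. A minimal std-only sketch of that "set-once global" behavior (using OnceLock as a stand-in, since this does not depend on rayon itself; try_build_global is a hypothetical name):

```rust
use std::sync::OnceLock;

// Stand-in for the process-wide global thread pool. In rayon this is the
// global pool installed by ThreadPoolBuilder::build_global; here we just
// store the configured thread count to demonstrate the set-once semantics.
static GLOBAL_POOL: OnceLock<usize> = OnceLock::new();

/// Hypothetical analogue of ThreadPoolBuilder::build_global: the first call
/// succeeds, every later call returns an Err instead of aborting the process.
fn try_build_global(num_threads: usize) -> Result<(), String> {
    GLOBAL_POOL
        .set(num_threads)
        .map_err(|_| "the global thread pool has already been initialized".to_string())
}

fn main() {
    // First initialization succeeds.
    assert!(try_build_global(8).is_ok());

    // A second initialization attempt fails with an error, but the process
    // keeps running -- which is why the error log in the test is non-fatal.
    let second = try_build_global(4);
    assert!(second.is_err());
    println!("second attempt: {:?}", second);
}
```

This matches the reading of the docs above: the duplicate-initialization error can be logged (or ignored) and execution continues, so the error log alone should not stall or kill the test.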
That specific error log is expected and part of the e2e test @Rjected
@joshieDo do you mind taking a look at this? I haven't run into this in a while though, so it may just be difficult to repro.
This issue is stale because it has been open for 21 days with no activity. |
I believe it's not an issue anymore; please re-open if it occurs again.
A recent merge queue run failed in a way that indicates we have something flaky in p2p::can_sync: https://github.com/paradigmxyz/reth/actions/runs/9007839847/job/24748566483
The important logs:
We might need something lazy for the thread pool, or it might have to do with launching multiple nodes? We have not debugged this in depth.
The error is thrown here:
reth/crates/node/builder/src/launch/common.rs
Lines 110 to 126 in d467744