Do not use `ParamEnv::and` when building a cache key from a param-env and trait eval candidate #95031

Conversation
That seems like pretty surprising behavior to me. I don't know what I would expect to happen here, but going from !global to global seems pretty prone to cause similar errors in the future. Should probably look at this code myself 😅 I want to know whether it is possible to instead change the representation we use in the selection cache so that we can continue using `ParamEnv::and`.
Yeah, it's somewhat surprising how we get here. It has to do with the fact that generator interiors have their regions erased and replaced with late-bound regions; then we normalize projections under binders; then we canonicalize and replace canonical placeholders with region variables; and finally we replace region variables with `ReErased` when caching. Still trying to fully understand why this is the only way to trigger this issue.
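To make the failure mode concrete, here is a minimal self-contained sketch of this kind of cache-key construction. The types and the `looks_global` flag are simplified stand-ins invented for illustration, not the actual rustc internals; the point is only that a key built this way collapses a param-env with caller bounds and an empty one into the same cache entry.

```rust
use std::collections::HashMap;

// Simplified stand-ins for rustc's ParamEnv / ParamEnvAnd (illustrative only;
// the real types and the "looks global" test live in the compiler).
#[derive(Clone, PartialEq, Eq, Hash, Debug)]
struct ParamEnv {
    caller_bounds: Vec<&'static str>,
}

#[derive(Clone, PartialEq, Eq, Hash, Debug)]
struct ParamEnvAnd {
    param_env: ParamEnv,
    value: &'static str,
}

impl ParamEnv {
    // Models the surprising behavior under discussion: if the value "looks
    // global" (e.g. because its regions were erased), the caller bounds are
    // dropped from the cache key.
    fn and(self, value: &'static str, looks_global: bool) -> ParamEnvAnd {
        let param_env =
            if looks_global { ParamEnv { caller_bounds: vec![] } } else { self };
        ParamEnvAnd { param_env, value }
    }
}

fn main() {
    let rich = ParamEnv { caller_bounds: vec!["T: Clone"] };
    let empty = ParamEnv { caller_bounds: vec![] };

    let mut cache: HashMap<ParamEnvAnd, bool> = HashMap::new();

    // A result evaluated under the rich env is keyed as if the env were empty...
    cache.insert(rich.and("candidate", true), true);

    // ...so a later lookup under a genuinely empty env hits that stale entry.
    assert_eq!(cache.get(&empty.and("candidate", true)), Some(&true));
}
```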
I am still skeptical that we should be using `ParamEnv::and` here. I could envision this going bad by allowing a trivially-falsifiable "almost global" bound. My thought is that we should always cache the evaluation result with exactly the param-env that we used to evaluate it; I'm having trouble seeing what benefit this approach provides over that.

(Separately, I'm independently interested in fixing the issue of almost-trivially-falsifiable bounds, since I see it come up often in other ICEs. Working on it, though.)
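As an illustration of what a trivially-falsifiable "almost global" bound can look like, here is an example of my own in the spirit of such ICE reports (not one taken from this thread). It assumes the nightly `trivial_bounds` feature, since stable rustc rejects where-clauses that mention none of the item's generic parameters.

```rust
#![feature(trivial_bounds)]
#![allow(trivial_bounds)]

// The bound below names no type parameters, only a higher-ranked lifetime,
// so once regions are erased it looks like a *global* bound -- yet it is
// plainly false: `&mut u32` is never `Copy`.
fn uncallable()
where
    for<'a> &'a mut u32: Copy,
{
    // Anything proven inside this body is proven from a falsehood, so no
    // evaluation result derived from this param-env should ever be cached
    // under (or hit from) an empty param-env.
}

fn main() {
    // `uncallable` can never be invoked: no caller can discharge the bound.
}
```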
Sorry for the delay in getting to this. I also think that this behavior could cause issues elsewhere with `ParamEnv::and`.
Force-pushed from 6ef764f to 8588f79
Added comment (and rebased)
@bors r+
📌 Commit 8588f79 has been approved.
☀️ Test successful - checks-actions
Finished benchmarking commit (ec667fb): comparison url. Summary:

If you disagree with this performance assessment, please file an issue in rust-lang/rustc-perf. @rustbot label: -perf-regression
Do not use `ParamEnv::and` to cache a param-env with a selection/evaluation candidate. This is because if the param-env is in `RevealAll` mode, and the candidate looks global (i.e. it has erased regions, which can show up when we normalize a projection type under a binder [1]), then when we use `ParamEnv::and` to pair the candidate and the param-env for use as a cache key, we will throw away the param-env's caller bounds. We end up caching a candidate that we inferred from the param-env under an empty param-env, which may cause a cache hit later when we really do have an empty param-env, and possibly mess with normalization during codegen, as we see in the referenced issue.

Not sure how to trigger this with a more structured test, but changing `check-pass` to `build-pass` triggers the case that #94903 detected.

[1]: That is, we replace the late-bound region with a placeholder, which gets canonicalized and turned into an inference variable, which gets erased during region freshening right before we cache the result. Sorry, it's quite a few steps.
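For contrast with the buggy key construction sketched earlier in the thread, here is the shape of the fix under the same toy model: build the cache key from the exact param-env used during evaluation, with no global-looking shortcut. This is a simplification of the idea, not the literal rustc diff.

```rust
use std::collections::HashMap;

// Same toy ParamEnv as the sketch above; a simplified stand-in, not the
// actual rustc data structure.
#[derive(Clone, PartialEq, Eq, Hash, Debug)]
struct ParamEnv {
    caller_bounds: Vec<&'static str>,
}

struct SelectionCache {
    // Key on the (param_env, candidate) pair directly: no caller bounds are
    // discarded on insertion, so envs with different bounds can never collide.
    map: HashMap<(ParamEnv, &'static str), bool>,
}

impl SelectionCache {
    fn insert(&mut self, param_env: ParamEnv, candidate: &'static str, result: bool) {
        self.map.insert((param_env, candidate), result);
    }

    fn get(&self, param_env: &ParamEnv, candidate: &'static str) -> Option<bool> {
        self.map.get(&(param_env.clone(), candidate)).copied()
    }
}

fn main() {
    let rich = ParamEnv { caller_bounds: vec!["T: Clone"] };
    let empty = ParamEnv { caller_bounds: vec![] };

    let mut cache = SelectionCache { map: HashMap::new() };
    cache.insert(rich, "candidate", true);

    // The empty env no longer hits the entry cached under the rich env.
    assert_eq!(cache.get(&empty, "candidate"), None);
}
```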
Fixes #94903
r? @Aaron1011 (or reassign as you see fit)