fallback for diverging expressions leaks into the `?` operator #39297
Comments
Maybe I'm misunderstanding what you're saying here, but as far as I can tell there'll still be some breakage. What about this, for instance:

```rust
#![feature(never_type)] // `!` as a type is feature-gated on nightly

trait Deserialize: Sized {
    fn deserialize() -> Result<Self, String>;
}

impl Deserialize for () {
    fn deserialize() -> Result<(), String> {
        Ok(())
    }
}

impl Deserialize for ! {
    fn deserialize() -> Result<!, String> {
        Err("Failed to deserialize a `!`".to_string())
    }
}

fn doit() -> Result<(), String> {
    // `_` is otherwise unconstrained, so its type comes from fallback:
    // `()` today, `!` once the never-type fallback lands.
    let _ = <_ as Deserialize>::deserialize()?;
    Ok(())
}

fn main() {
    let _ = doit();
}
```

With the `()` fallback this selects the `()` impl and `doit()` returns `Ok(())`; with a `!` fallback it would select the `!` impl, whose `Err` gets propagated by `?`, silently changing what `doit()` returns.
So @eddyb and I were talking on IRC, and we had come to the conclusion that something wacky is going on in the LUB code. In particular, if this type-checks (and it does):

```rust
fn main() {
    let x = if true { 22 } else { return; 'a' };
}
```

then we ought not to be considering the type of the diverging arm at all. (Oddly, that code fails if you add
@eddyb notes that we could remove these lines to correct the behavior with an explicit type annotation.
I'm pretty sure the whole
I've made the change and tested it. Note that these:

```rust
fn main() {
    // `panic!()` diverges, so the reference is never actually produced;
    // whether this type-checks hinges on the diverging-expression rules.
    &panic!()
}
```

```rust
fn f() -> isize {
    // Both tuple elements diverge via `return`, so the tuple itself is
    // unreachable; again the result depends on the divergence rules.
    (return 1, return 2)
}

fn main() {}
```
Ignore expected type in diverging blocks -- As per comment: #39297 (comment)
I'm going to close this bug in favor of #66173 -- at this point, the existing fallback strategy is not changing, so the only question is whether we can lint about some of its more surprising cases (including the motivating example for this issue).
This curious example was found by @canndrew:
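The snippet itself is not reproduced here; what follows is a minimal sketch consistent with the discussion below, assuming a single `i32` impl (which the text implies) and reusing the names from @canndrew's variant above:

```rust
trait Deserialize: Sized {
    fn deserialize() -> Result<Self, String>;
}

impl Deserialize for i32 {
    fn deserialize() -> Result<i32, String> {
        Ok(22)
    }
}

fn doit() -> Result<(), String> {
    // `_` could only be `i32`, but the diverging fallback introduced by the
    // `?` desugaring infers `()` instead, and `(): Deserialize` does not
    // hold, so this fails to compile.
    let _ = <_ as Deserialize>::deserialize()?;
    Ok(())
}

fn main() {
    let _ = doit();
}
```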
This fails to compile. In particular, the `_` is inferred to `()` (at present) rather than `i32`. This is because of the interaction of two things:

- the diverging-expression fallback applies to the `Self` type, for some sort of arbitrary reason;
- `?` desugars into a `match` where one of the arms has a `return`; the type of this `return` thus has a diverging default (a sketch of the desugaring follows this list).
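For reference, a rough sketch of the `?` desugaring being described, reusing the `Deserialize` trait from the example (an illustration, not the compiler's exact expansion):

```rust
fn doit() -> Result<(), String> {
    // Roughly what `<_ as Deserialize>::deserialize()?` expands to:
    let _ = match <_ as Deserialize>::deserialize() {
        Ok(v) => v,
        // The `return` makes this arm diverge, so the type checker applies
        // the diverging default discussed in this issue.
        Err(e) => return Err(From::from(e)),
    };
    Ok(())
}
```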
Since there are no other constraints on the type of `_`, this winds up defaulting to `()`. Once the never-type work completes, it will default to `!`.

It's not entirely clear that this is a bug -- each part sort of makes sense -- but the result is pretty confounding. Some of the work on improving the trait system I've been doing would lead to this example compiling, because the `_` would be inferred to `i32`.
Note that there are variants of this which do compile (because of the fallback to `()`) -- i.e., if you changed the impl to be implemented for `()`. In this case, changing the fallback (to `!`) without improving the trait system's inference leads to a regression, since we fail to infer that `()` was the right answer all along.

I think that improving the trait system's inference does not lead to any breakage (since the default never kicks in). The basic reasoning is that, if the code compiled with a default before, but now compiles with improved inference, then the trait system must infer the same thing as the default, since otherwise there'd be ambiguity and it should not have done any inference (put another way, if it found another answer, then the default should have led to a compilation error).
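For concreteness, a sketch of the compiling variant described above (assuming the impl is moved to `()`):

```rust
trait Deserialize: Sized {
    fn deserialize() -> Result<Self, String>;
}

impl Deserialize for () {
    fn deserialize() -> Result<(), String> {
        Ok(())
    }
}

fn doit() -> Result<(), String> {
    // `_` has no other constraints, so it falls back to `()`, which has an
    // impl -- this compiles today. With a `!` fallback and no smarter trait
    // inference, it would stop compiling.
    let _ = <_ as Deserialize>::deserialize()?;
    Ok(())
}

fn main() {
    let _ = doit();
}
```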
That said, I think we should stop desugaring `?` when we lower to HIR, and instead do it when we lower to MIR. This would be helpful for implementing `catch`, and would also give us more control over how the typing works. I think it's quite surprising the way the "divergence" is hidden in this example.

cc @eddyb @aturon, with whom I've discussed related issues