Improve context schema rules for await
#3571
Comments
Sounds good! Here's one perspective that supports the claim that it is a relaxation (allowing more after the change than before, in a narrow sense): The context […]

Soundness: It is always the operand of […]

Ship it! 😄
The […] does look unsound. The […]

I'm still not particularly happy about loosening the typing rules for await. My preferred rule change would be […]. The typing rules around […]

(It wouldn't solve the issue with the original post's code, though.)
I'm a little nervous about this proposal, because it pushes on an area where we are constrained by the limited set of union/intersection types we support. In general, unions and intersections may support a rich algebra (think e.g. along the lines of De Morgan's laws), and they may be defined to "distribute" over terms in rich ways. So, for example, given a thing of type […]

With […]

We are considering the case where we have […]
Note that each of these types is expressible given our current set of combinators. However, the composition of them (which is, in some sense, the "correct" context type for […]) is not.
We can re-distribute the union types in the above without losing information by taking the following steps:
so by applying the above rewrites (not unique, unfortunately, but without losing information) we get to:
We can further simplify the above by observing that some of the arms of the union are subtypes of others, and hence can be folded in:
So a context type which covers the full set of possible expressions is:
Note that this cannot be simplified further without losing information (e.g. using […]). Note also that this is not a unique decomposition:
I don't immediately see any way to do better here without losing information, and I don't see a good lossy supertype to use either. Based on this analysis then, one could imagine simply choosing one of the different arms of the simplified unions above to use as the context type:
The last seems clearly not useful. The second is what is being proposed here. How do these compare? Well, if we start from the full set of types that we enumerated above as the possible static types for […]

The current implementation uses […]
That is, […]

The proposal here is to use […]
If I've done this analysis correctly, then it seems that the proposed context type is strictly better than our current choice based on this criterion. Note that it is not clear to me that this is the only criterion we need to consider (e.g. we should think about how this might interact with inference). Finally, it's worth looking at the other option suggested by the analysis above, which is to arbitrarily choose the first arm of the simplified union type above for the context type, that is […]
Interestingly, this choice covers all of the cases where […]
Thinking more, this feels like a self-inflicted problem. The reason we get into this issue is that we have a special rule for a context type that is already […]

So, the special-case rule is why we have a problem, so maybe it's just not good enough.

If we want to keep trying to find a sufficient approximation, then: given […]

It's not a given that it is always possible to find such a […]. It's almost certain that […]

And let's look at the four two-level nested […]

(Where we notice that both […] is really […].)

How do those compare to what we have today:

where […]

The underlying idea is that if you're awaiting something, that […]

That is, I think there is a valid reason, derived from desired constraints on the special-case rule that we have introduced, to choose […]

Which also means it won't necessarily generalize to other union types, if we had general union types. (But if we have general union types, we need to apply flatten to those too, so maybe the answer will always be […])
Perhaps we should restart this discussion based on the assumption that we adopt #870. The main point of doing that is that it makes type inference simpler in […]

```dart
// --- Glue code, needed to make the example compilable.
class UserEntry {
  String? get nameOrNull => null;
}

Future<UserEntry> readUserEntryFromDatabase() async => UserEntry();

// --- Example from issue. Emulate #870.
Future<T> readUserDataFromDatabase<T>(
    T Function(UserEntry) extractDataFromEntry) async {
  UserEntry userEntry = await readUserEntryFromDatabase();
  T result = extractDataFromEntry(userEntry);
  return result;
}

Future<String> getUserNameOrDefault() async {
  String result =
      (await readUserDataFromDatabase((userEntry) => userEntry.nameOrNull)) ??
          'Unknown name';
  return result;
}
```

No nested future types anywhere!
I like the principled nature of @leafpetersen's analysis in #3571 (comment). When I was trying to write up the issue in the first place, I think I was trying to mentally reach for something like this, but I kept getting lost in the details and eventually gave up. Reading Leaf's comment makes me want to try again. But I'm going to throw a monkey wrench into it: I'm going to posit that in practice, for real-world Dart code, developers never intend for an […]

Repeating Leaf's analysis with that assumption: […]

Taking the union of these types, we have […]

Applying the same rewrites that Leaf used, we are left with […]

So if you accept my assumption (that our context for […])…
I agree. The point is to not have a future after […].

But, if we want to be consistent, should we recursively process deeper nestings, like a context type of […]?
Yeah, I suppose we could do that. In fact, perhaps a principled way to think of this is: to perform type inference on an expression of the form […]

* I'm not 100% sure this is true. It's certainly true for well-behaved code, though. Throwing out any parts of […]

Then, I think the only simplification we need to do is to reduce […].

So the complete rule would be: to perform type inference on an expression of the form […]
That would give the same results as my initial proposal, except that:
All 3 of these bullets seem like improvements to me.
It's not true in general, but it is true if the code never contains any […].

The context doesn't need to be "sound", as long as we don't strictly enforce it. It just has to be helpful.
Should we generally consider a top-type as context type to be […]?

We may want to allow […]
Not sure that's an improvement. (Also not sure it isn't. Can't really wrap my head around it.)

Come to think of it, maybe the context type for an […]

In that case, remove all […]
Imagine we have an asynchronous function that may or may not produce a result:
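A minimal sketch of the kind of function being described, borrowing the hypothetical `UserEntry` and `readUserEntryFromDatabase` declarations from the compilable example later in this thread:

```dart
// Sketch: an async function whose result may be absent, hence String?.
// UserEntry and readUserEntryFromDatabase are as in the "glue code" comment
// further down the thread.
Future<String?> readUserDataFromDatabase() async {
  UserEntry userEntry = await readUserEntryFromDatabase();
  return userEntry.nameOrNull; // may be null
}
```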
And we wrap it in an asynchronous function that supplies a default value:
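A sketch of such a wrapper, matching the `getUserNameOrDefault` shape used later in the thread:

```dart
// Sketch: await the possibly-null result and substitute a default.
Future<String> getUserNameOrDefault() async {
  String result = (await readUserDataFromDatabase()) ?? 'Unknown name';
  return result;
}
```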
Now imagine we want to make the original function more generic, e.g.:
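A sketch of the generic version, assuming the extractor-callback signature used in the compilable example in the comments; the wrapper now passes a closure:

```dart
// Sketch: the database read is now generic over the extracted data type.
Future<T> readUserDataFromDatabase<T>(
    T Function(UserEntry) extractDataFromEntry) async {
  UserEntry userEntry = await readUserEntryFromDatabase();
  return extractDataFromEntry(userEntry);
}

// The wrapper passes an extractor closure. This is the call that stops
// compiling: T ends up inferred as String rather than String?, so the
// closure's String? return type no longer fits.
Future<String> getUserNameOrDefault() async {
  String result =
      (await readUserDataFromDatabase((userEntry) => userEntry.nameOrNull)) ??
          'Unknown name';
  return result;
}
```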
This no longer works, and it's not obvious why from the error messages. Here are the error messages from the CFE:
It turns out that what's happening is this:
- The whole expression (`(await readUserDataFromDatabase(...)) ?? 'Unknown name'`) is being analyzed with a context schema of `FutureOr<String>`.
- Therefore, the left-hand side of the `??` expression (`await readUserDataFromDatabase(...)`) is being analyzed with a context schema of `FutureOr<String>?`.
- The existing context schema rules for `await e`, in context `K`, involve analyzing the subexpression `e` using the context `J`, where `J` is computed as follows:
  - If `K` is `_` or `dynamic`*, then `J` is `_`.
  - Otherwise, if `K` is `FutureOr<T>`, `FutureOr<T>?`, or `FutureOr<T>*`, then `J` is `K`.
  - Otherwise, `J` is `FutureOr<K>`.
- Therefore, `readUserDataFromDatabase(...)` is analyzed using a context schema of `FutureOr<String>?`.
- Therefore, the type argument of `readUserDataFromDatabase` is being inferred as `String`, even though `String?` is what was intended.

*Note that the behavior of star types is not really important anymore, because pre-null-safety code is no longer supported; but support for star types hasn't been removed from the CFE yet, so I'm including the full rule for completeness.
I would like to modify the context schema rules for `await` to this:

- If `K` is `_` or `dynamic`, then `J` is `_`.
- Otherwise, if `K` is `FutureOr<T>`, then `J` is `K`.
- Otherwise, if `K` is `FutureOr<T>?`, then `J` is `FutureOr<T?>`.
- Otherwise, if `K` is `FutureOr<T>*`, then `J` is `FutureOr<T*>`.
- Otherwise, `J` is `FutureOr<K>`.

Once again I'm including information about star types for completeness.
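For concreteness, here is a small, non-normative sketch of that computation over a toy schema representation. The class names and the `awaitContext` function are made up for illustration (they are not analyzer or CFE API), and star types are omitted:

```dart
/// Toy model of a context schema, just rich enough to express the rule above.
/// None of these classes correspond to real analyzer/CFE types.
sealed class Schema {}

class Unknown extends Schema {} // the unknown schema `_`
class Dynamic extends Schema {} // `dynamic`

class FutureOrSchema extends Schema {
  // `FutureOr<T>`
  final Schema typeArg;
  FutureOrSchema(this.typeArg);
}

class Nullable extends Schema {
  // `S?`
  final Schema inner;
  Nullable(this.inner);
}

class Other extends Schema {} // any other schema

/// Proposed rule: given the context K of `await e`, compute the context J
/// used to analyze `e`.
Schema awaitContext(Schema k) {
  if (k is Unknown || k is Dynamic) return Unknown(); // J = `_`
  if (k is FutureOrSchema) return k; // K = FutureOr<T>  =>  J = K
  if (k is Nullable && k.inner is FutureOrSchema) {
    // K = FutureOr<T>?  =>  J = FutureOr<T?>.
    // This clause is the proposed change; the current rule keeps J = K here.
    final t = (k.inner as FutureOrSchema).typeArg;
    return FutureOrSchema(Nullable(t));
  }
  return FutureOrSchema(k); // otherwise J = FutureOr<K>
}
```

For the example above, `K` is `FutureOr<String>?`, so the current rule yields `J = FutureOr<String>?`, while the proposed rule yields `J = FutureOr<String?>`.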
This would allow the code example above to be accepted, because `readUserDataFromDatabase(...)` would be analyzed using a context schema of `FutureOr<String?>`.

Note that the only difference is in the behavior if `K` is `FutureOr<T>?` (or `FutureOr<T>*`). I think the change is justified because, in this situation, the `await e` expression is in a context that permits `null`, so it makes sense that the context for `e` should permit a future that completes with `null`.

As with any change to type inference, this would be a breaking change in theory, because any change to an inferred type could lead to cascading changes to other inferred types, which could in turn lead to a static error in code that would previously have been accepted. However, I've prototyped the change in google3 (in the analyzer only) and found no breakages. So I think the risk of breaking real-world code is low.
An additional reason we might want to consider this change is that if we ever decide to go ahead with #3471 (require every expression to satisfy its context), then we will need to revise the context schema rules for `await` in this way, in order to avoid serious breakage.

I do think that if we make this change, it would be reasonable to do it in a language-versioned way, so that after the change, if anyone writes code like the example above, they won't risk publishing it to pub in a form that will break with older versions of Dart.
@dart-lang/language-team what do you think?