@stream and @defer: Asynchrony, laziness, or both? #691
Comments
@nodkz Technically @lazy could be implemented without state if we standardized identity (i.e. defined a way of retrieving additional object data by id in a second roundtrip). However, stateless laziness wouldn't be especially useful for the use case of optimistic preloading, due to latency. Therefore it seems the only sensible implementation of @lazy would be stateful. Currently we're exploring reusing the spec concepts introduced for Subscriptions for incremental delivery.
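For illustration, here is a minimal sketch of what "stateless laziness via standardized identity" could look like, assuming a Relay-style globally unique `id` and a `node(id:)` refetch field; these names are conventions used for the sketch, not anything defined by the spec:

```ts
// Sketch only: assumes a Relay-style global `id` and `node(id:)` refetch field.
// First roundtrip: fetch the data needed now, plus an id for later refetching.
const initialQuery = /* GraphQL */ `
  query Profile {
    user(login: "nodkz") {
      id   # globally unique identifier captured for a possible second roundtrip
      name
    }
  }
`;

// Second roundtrip: only if the client actually needs the extra data,
// it lazily fetches it by id — no server-side state required.
const lazyFollowUpQuery = /* GraphQL */ `
  query ProfileDetails($id: ID!) {
    node(id: $id) {
      ... on User {
        bio
        followers {
          totalCount
        }
      }
    }
  }
`;
```

The latency cost mentioned above is visible here: the follow-up data only starts resolving after the second request arrives, which defeats the point of preloading.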
@jhusain Using your example, what is the difference between:
Maybe this example is too simplistic and does not fully demonstrate the advantages of
The difference between 1. and 2. would show up if you have other common configuration. For example:
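A sketch of the kind of shared configuration meant here; the `user(id:, locale:)` selection and field names are hypothetical stand-ins:

```ts
// Two separate queries that each repeat the same "common configuration"
// (the user(id:, locale:) selection and its arguments):
const eagerPart = /* GraphQL */ `
  query ProfileEager {
    user(id: "4", locale: "en") {
      name
    }
  }
`;
const deferredPart = /* GraphQL */ `
  query ProfileDeferred {
    user(id: "4", locale: "en") {
      comments(first: 10) { text }
    }
  }
`;

// A single query with @defer shares that configuration structurally, so the
// server never has to detect that the two halves refer to the "same" user:
const combined = /* GraphQL */ `
  query Profile {
    user(id: "4", locale: "en") {
      name
      ... @defer {
        comments(first: 10) { text }
      }
    }
  }
`;
```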
It becomes much more complicated to analyze the separate queries and identify the parts that are equal in order to merge them together. If it's just one query, there is a structural guarantee that everything is shared across the different requested fields.
As we explore the incremental delivery solution space, it's useful to consider the following orthogonal axes independently:
- Asynchrony: whether the server may deliver the data for a single request across multiple responses.
- Laziness: whether the server may delay (or skip) resolving part of the requested data until the client explicitly signals that it is needed.
This issue attempts to highlight the difference between these options, describe the applicable use cases for each, and propose an approach that allows standardizing the two separately.
Asynchronous data delivery
As specified today, the @defer and @stream directives grant the server permission to deliver data asynchronously across multiple responses. However, the server can choose to ignore this hint and deliver all of the requested data together, synchronously, in the initial response.
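To make "multiple responses" concrete, here is a sketch of the payload sequence a server might produce for a query containing @defer. The exact shape (`data`, `path`, `hasNext`) follows the incremental delivery RFC drafts and is an assumption, not settled spec:

```ts
// Hypothetical payload shape for incremental delivery.
type ExecutionPatch = {
  data: Record<string, unknown>;
  path?: (string | number)[]; // absent on the initial payload
  hasNext: boolean;
};

const payloads: ExecutionPatch[] = [
  // Initial response: everything that was not deferred.
  { data: { user: { name: "Ada" } }, hasNext: true },
  // Later response: the deferred fragment, patched in at `path`.
  { data: { comments: [{ text: "Nice post!" }] }, path: ["user"], hasNext: false },
];
```

A server that ignores the hint would simply return a single payload containing all of the data, with no patches.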
There are two use cases for asynchronous data delivery:
- Deprioritizing data that is not needed for the initial render, so that the critical data arrives and can be displayed sooner.
- Optimistically preloading data that the user is likely to need next.
Asynchronous delivery can improve TTI (time to interactive) for both use cases. However, applications that use incremental delivery for preloading pay an increased cost: data preloaded asynchronously must be resolved and delivered, but may never be displayed. For optimistic preloading, laziness can provide a smoother trade-off between improved TTI and cost.
Optimistic preloading with lazy data delivery
In order to reduce the cost of optimistically preloading data, we could introduce a @lazy directive. The @lazy directive is a hint to the server that the client will send an explicit signal when a fragment in a requested query is needed. Consequently, the server can choose to delay resolving the fragment until the client signals that the data is needed. Notably, @lazy gives the server the option to never resolve the fragment at all if it does not receive a signal from the client that the fragment is needed.
Consider the following query:
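(Illustrative sketch; the field names and fragment are hypothetical, but the shape matches the discussion that follows — a fragment marked with the existing @defer directive.)

```ts
// The `Comments` fragment is marked with @defer, so the server may deliver it
// in a later payload while still resolving it eagerly.
const profileQuery = /* GraphQL */ `
  query Profile {
    user(id: "4") {
      name
      ...Comments @defer
    }
  }

  fragment Comments on User {
    comments(first: 10) {
      text
    }
  }
`;
```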
At build time, the type of the query response would be as follows:
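(Sketch of the client-visible response type, assuming the hypothetical query above; the deferred fragment surfaces as a Promise.)

```ts
// With @defer, the deferred fragment is work the server has already started:
// eagerly evaluated, asynchronously delivered.
type ProfileQueryResponse = {
  user: {
    name: string;
    comments: Promise<{ text: string }[]>;
  };
};
```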
Notice that the use of a Promise is appropriate for @defer, because Promises are eagerly evaluated: they perform their work at creation time rather than when their data is requested by the consumer. In other words, @defer forces the server to pay the cost of resolving the data regardless of whether it is ever used.
Alternatively, clients can use a hypothetical @lazy directive to hint that the server may lazily resolve a subset of the requested data:
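(The same illustrative query as above, with the hypothetical @lazy directive in place of @defer.)

```ts
// @lazy hints that the server may wait for an explicit client signal before
// resolving the fragment at all.
const profileQueryLazy = /* GraphQL */ `
  query Profile {
    user(id: "4") {
      name
      ...Comments @lazy
    }
  }

  fragment Comments on User {
    comments(first: 10) {
      text
    }
  }
`;
```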
This changes the type of the query response as follows:
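(Again a sketch against the hypothetical query: the fragment now surfaces as a thunk rather than a Promise.)

```ts
// With @lazy, nothing is resolved until the client invokes the thunk, and the
// invocation itself is what signals the server to resolve the data.
type ProfileQueryLazyResponse = {
  user: {
    name: string;
    comments: () => Promise<{ text: string }[]>;
  };
};
```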
Note that the introduction of a thunk forces the client to invoke a method in order to retrieve the data. When the method is invoked, the client sends a message to the server indicating that it now needs the requested data to be resolved. A useful implementation of @lazy would presumably necessitate a duplex transport protocol; servers without duplex support would simply resolve the data eagerly.
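As a rough sketch of that interaction, one could imagine the client building the thunk over a duplex transport such as a WebSocket. The message shape, the `resolve` signal, and the patch format here are all hypothetical:

```ts
// Sketch only: turn a lazily-resolved fragment into a thunk backed by a duplex connection.
function makeLazyThunk<T>(socket: WebSocket, path: (string | number)[]): () => Promise<T> {
  let pending: Promise<T> | undefined;
  return () => {
    // Send the "I need this fragment now" signal at most once.
    pending ??= new Promise<T>((resolve) => {
      socket.addEventListener("message", function onMessage(event) {
        const patch = JSON.parse(event.data);
        // Resolve once the server sends back the patch for this fragment's path.
        if (JSON.stringify(patch.path) === JSON.stringify(path)) {
          socket.removeEventListener("message", onMessage);
          resolve(patch.data as T);
        }
      });
      socket.send(JSON.stringify({ type: "resolve", path }));
    });
    return pending;
  };
}
```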
At first glance lazy loading may seem to be in direct conflict with preloading. However, recall that @lazy, like @defer, is a hint to the server. During peak load the server may opt to reduce cost by waiting for an explicit signal from the client that the data is needed before resolving it. When capacity is cheap, on the other hand, the server may choose to eagerly resolve the data and send it to the client as soon as it is available. Notably, this means that the server may never resolve the requested data unless the client explicitly asks for it.
How should we proceed?
Hopefully the examples in this issue demonstrate the difference between asynchrony and laziness in incremental delivery. This strawman @lazy proposal is just one option; we could also add parameters to the @defer and @stream directives to control laziness. The meta-point is that whichever approach we take, we should ensure that we have a path to supporting laziness in the future.
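That alternative might be sketched as a hypothetical `lazy` argument on the existing directive; the argument name is illustrative, not part of any current proposal text:

```ts
// Hypothetical: controlling laziness via an argument on @defer rather than a new directive.
const query = /* GraphQL */ `
  query Profile {
    user(id: "4") {
      name
      ...Comments @defer(lazy: true)
    }
  }

  fragment Comments on User {
    comments(first: 10) {
      text
    }
  }
`;
```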