
@stream and @defer: Asynchrony, laziness, or both? #691

Open
jhusain opened this issue Feb 25, 2020 · 4 comments

Comments


jhusain commented Feb 25, 2020

As we explore the incremental delivery solution space it's useful to consider the following orthogonal axes independently:

  • Asynchronous versus synchronous data delivery
  • Lazy versus eager data resolution

This issue attempts to highlight the difference between these options, identify the applicable use cases for each, and propose an approach that allows the two to be standardized separately.

Asynchronous data delivery

As specified today, the @defer and @stream directives grant the server permission to deliver data asynchronously across multiple responses. However, the server can choose to ignore this hint and deliver all of the requested data together synchronously in the initial response.

There are two use cases for asynchronous data delivery:

  1. Deferring data. Improving time-to-interactive (TTI) by delaying the delivery of less critical data which is nonetheless visible on the current view (e.g. a chat window which appears in the bottom left of the screen after the rest of the content has loaded).
  2. Preloading data. Speculatively preloading data which may never be displayed, either because the user never scrolls the information into the viewport or never visits the view which displays it.

Asynchronous delivery can improve TTI for both use cases. However, applications which use incremental delivery for preloading do so at the expense of increased cost, because data that is preloaded asynchronously must always be resolved and delivered even though it may never be displayed. For optimistic preloading, laziness can provide a smoother trade-off between improved TTI and cost.

Optimistic preloading with lazy data delivery

In order to reduce the cost of optimistically preloading data, we could introduce a lazy directive. The @lazy directive is a hint to the server that the client will send an explicit signal when a fragment in a requested query is needed. Consequently the server can choose not to resolve the fragment until the client sends a signal indicating that the data is needed. Notably, @lazy gives the server the option not to resolve the fragment at all if it never receives a signal from the client that the fragment is needed.

Consider the following query:

{
  viewer {
    name @defer
    age
  }
}

At build time the type of the query response would be as follows:

{
  viewer: {
    name: Promise<string>
    age: number
  }
}

Notice that the use of a Promise is appropriate for @defer, because Promises are eagerly evaluated. Promises perform their work at creation time rather than when their data is requested by the consumer. As we can see, @defer forces the server to pay the cost of resolving the data regardless of whether it is used.
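
As a purely illustrative TypeScript sketch of this point (not from the spec): the executor passed to a Promise constructor runs immediately when the Promise is created, so the resolution cost is paid up front. Here resolveName is a hypothetical stand-in for the expensive resolution work.

// Hypothetical stand-in for the expensive work needed to resolve the field.
const resolveName = (): string => {
  console.log("resolving name (this work happens immediately)");
  return "Alice";
};

// The executor runs as soon as the Promise is constructed, so the cost of
// resolving the field is incurred even if nothing ever awaits the value.
const deferredName: Promise<string> = new Promise((resolve) => {
  resolve(resolveName());
});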

Alternatively, clients can hint to the server that it may lazily resolve a subset of the requested data using a hypothetical @lazy directive:

{
  viewer {
    name @lazy @defer
    age
  }
}

This changes the type of the query response as follows:

{
  viewer: {
    name: () => Promise<string>
    age: number
  }
}

Note that the introduction of a thunk forces the client to invoke a function in order to retrieve the data. When the function is invoked, the client would send a message to the server indicating that the requested data is now needed and should be resolved. Presumably a useful implementation of @lazy would require a duplex transport protocol. Servers without duplex support would simply resolve the data eagerly.
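
To make the thunk shape concrete, here is a rough client-side sketch in TypeScript. The DuplexTransport interface and its send/waitForPayload methods are hypothetical, not part of any spec or library; the point is only that invoking the thunk is what triggers the signal to the server.

// Hypothetical duplex transport: `send` pushes a message to the server and
// `waitForPayload` resolves once the server delivers the labeled payload.
interface DuplexTransport {
  send(message: { type: "resolve"; label: string }): void;
  waitForPayload(label: string): Promise<string>;
}

// A sketch of the thunk a client library might hand to application code
// for a field requested with @lazy @defer.
function makeLazyField(transport: DuplexTransport, label: string): () => Promise<string> {
  let pending: Promise<string> | undefined;
  return () => {
    // Signal the server only the first time the field is actually needed.
    if (!pending) {
      transport.send({ type: "resolve", label });
      pending = transport.waitForPayload(label);
    }
    return pending;
  };
}

// Usage: nothing is sent to the server until the UI actually needs viewer.name.
// const name = makeLazyField(transport, "viewer.name");
// name().then(render);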

At first glance lazy loading may seem to be in direct conflict with preloading. However, recall that @lazy, like @defer, is a hint to the server. During peak load the server may opt to reduce cost by waiting for an explicit signal from the client that the data is needed before resolving it. When capacity is cheap, the server may instead choose to eagerly resolve the data and send it to the client as soon as it is available. Notably, this means the server may never resolve the requested data at all unless the client explicitly signals that it is needed.
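
On the server side, because @lazy is only a hint, a resolver wrapper could pick either strategy per request. The following TypeScript sketch is purely illustrative; loadIsHigh and awaitSignal are hypothetical hooks rather than any real API.

// Resolve a field requested with @lazy: eagerly when capacity is cheap,
// otherwise only after the client explicitly signals that it needs the data
// (which may never happen, in which case the field is never resolved).
function resolveLazyField<T>(
  loadIsHigh: () => boolean,
  awaitSignal: () => Promise<void>,
  resolve: () => Promise<T>
): Promise<T> {
  if (!loadIsHigh()) {
    return resolve(); // cheap capacity: behave as if @lazy were absent
  }
  return awaitSignal().then(resolve); // under load: wait for the client's signal
}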

How should we proceed?

Hopefully the examples in this issue demonstrate the difference between asynchrony and laziness in incremental delivery. This strawman @lazy proposal is just one option. We could also add parameters to the @defer and @stream directives to control laziness. The metapoint here is that whichever approach we take, we should ensure that we have a path to support laziness in the future.

jhusain changed the title from "Stream/defer: Async vs. Lazy" to "@stream and @defer: Asynchrony, laziness, or both?" on Feb 25, 2020

nodkz commented Feb 25, 2020

@lazy would be quite a useful directive. And I like it!

BUT the @lazy directive will add some state management on the server side. For now Queries and Mutations are stateless. And if we want to make a lazy_call() we have to request the same server which keeps the user context data, query & variables. Almost like it happens with Subscriptions.

The addition of the @lazy directive will make Queries and Mutations stateful. Or am I missing something? Or maybe we can somehow reuse Subscriptions for this case?


jhusain commented Feb 25, 2020

@nodkz Technically @lazy could be implemented without state if we standardized identity (i.e. defined a way of retrieving additional object data by id in a second round trip). However stateless laziness wouldn't be especially useful for the optimistic preloading use case, due to the added latency. Therefore it seems the only sensible implementation of @lazy would be stateful. Currently we're exploring reusing the spec concepts introduced for Subscriptions for incremental delivery.


Victor-Savu commented Mar 13, 2020

@jhusain What are the differences between using @lazy and just sending a request for the specific data?

Using your example, what is the difference between:

  1. sending:
    {
      viewer {
        name @lazy
        age
      }
    }
    and then sending a signal to resolve the name field; and:
  2. sending
    {
      viewer {
        age
      }
    }
    and then sending:
    {
      viewer {
        name
      }
    }

Maybe this example is too simplistic and does not fully demonstrate the advantages of @lazy. Could you give an example that maybe clarifies the distinction between the two approaches?

@valentin-panalyt

The difference between 1. and 2. shows up when you have other common configuration. For example:

{
  viewer(... a lot of common parameters like filters ...) {
    ...
  }
}

With two separate requests it becomes much more complicated to analyze the queries and identify the parts that are equal in order to merge them together. If it's just one query, there is a structural guarantee that everything is shared between the different requested fields.
