
Transport-level query batching #531

Merged · 11 commits · Aug 30, 2016

Conversation

Poincare
Contributor

@Poincare Poincare commented Aug 11, 2016

Adds transport-level query batching support to Apollo Client (see #505).

TODO:

  • Update CHANGELOG.md with your change
  • Make sure all of the significant new logic is covered by tests
  • Rebase your changes on master so that they can be merged easily
  • Make sure all tests and linter rules pass

@Poincare Poincare changed the title [WIP] Transport batching Transport batching Aug 12, 2016
@Poincare Poincare changed the title Transport batching Transport-level query batching Aug 12, 2016
@stubailo
Contributor

This is great. We should do some things before merging and shipping the feature:

  1. Which server implementations currently support batching? Of course, apollo-server. What about graphql-ruby, sangria, express-graphql with https://github.com/nodkz/react-relay-network-layer? Can we make this solution work with those as well? There is no spec for how this stuff should work, but perhaps we can build something close enough that people can use it with different servers. (see here as well: [WIP] Add rudimentary batched query support graphql/express-graphql#100)
  2. We should write documentation, both for how to use this feature on the client and how one might add this to their server (with appropriate links to various server docs etc)

@nodkz

nodkz commented Aug 13, 2016

> @nodkz I'm curious - what is the id for in your react-relay-network-layer? I think this is the right API and @Poincare is in the process of implementing batching for Apollo Client - would it be reasonable for us to just implement an array of requests and array of responses?

In short, id is the Relay request id.

Relay provides a list of queries to fetch. I can return the payloads in any order, but I need to know which payload answers which query. So I attach an id to each query, combine them into an array, and send it to the server. The server processes all the queries and sends back an array of payloads, each tagged with its id. The client receives the response, traverses the array, and maps each payload back to its Relay query by id.
https://github.com/nodkz/react-relay-network-layer/blob/master/src/relay/queriesBatch.js#L6

Example of batch request:

[
  {
    "id": "q0",
    "query": `
      query ViewerQueries {
        viewer {
          ...F0
        }
      }
      fragment F0 on Viewer {
        id,
        usersCount
      }
    `,
    "variables": {}
  },
  {
    "id": "q1",
    "query": "query ViewerQueries { ... }",
    "variables": {}
  }
]

Batch response:

[
  {
    "id": "q0",
    "payload": {
      "data": {
        "viewer": {
          "id": "Q2FiaW5ldDo1N2FjOWE2NDVhZTUxYTI1ZmYxZGNkMzg=",
          "usersCount": 0
        }
      }
    }
  },
  {
    "id": "q1",
    "payload": {
      "data": null,
      "errors": [ ... ]
    }
  }
]
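The id-based correlation described above can be sketched as follows. This is illustrative only: `correlateById` and the payload types are assumptions inferred from the request/response examples in this thread, not code from react-relay-network-layer.

```typescript
// Shape of one entry in the batched response array shown above.
interface BatchedPayload {
  id: string;
  payload: { data: any; errors?: any[] };
}

// Re-associate batched payloads with their originating queries by id,
// regardless of the order in which the server returned them.
function correlateById(
  responses: BatchedPayload[],
): { [id: string]: { data: any; errors?: any[] } } {
  const byId: { [id: string]: { data: any; errors?: any[] } } = {};
  responses.forEach((res) => {
    byId[res.id] = res.payload;
  });
  return byId;
}
```

With this mapping in hand, the network layer can resolve each pending Relay request by looking up its own id, even if payloads arrive reordered.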

@stubailo
Contributor

OK, so the idea is that Relay batching is built with the idea that requests might arrive out of order or separately, even if they were sent as a batch? This seems like it would never come up with HTTP batching, right? So maybe we could remove the id from the HTTP batching implementation?

@nodkz

nodkz commented Aug 18, 2016

@stubailo if graphql/express-graphql#100 is accepted, then I'll remove id from the request, since express-graphql will guarantee that the returned payloads are in the same order as the queries.

But if I implement a network layer over WebSockets, I'll keep returning ids, to have the ability to pass fetched data to Relay while other queries are still in progress.

@Poincare
Contributor Author

Poincare commented Aug 18, 2016

I'm not sure if batching is necessary at all over a WebSocket transport since the concept of a "HTTP roundtrip" no longer exists.

@nodkz

nodkz commented Aug 18, 2016

Nice catch @Poincare, I totally agree with you.

@nodkz

nodkz commented Aug 18, 2016

Oops, I don't agree after all.

Suppose the following scenario: Relay gives us 4 queries q1, q2, q3, q4, and we put them all on a WebSocket.
We should wrap each one in something with an id so that the returned payloads, which may arrive in a different order, e.g. p3, p1, p4, p2, can be correlated with their queries.

PS.
I'm talking about adding ids to queries over WebSockets, not about batching.
But with batching over HTTP we can eliminate the ids, as long as the returned array preserves the request order.
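A minimal sketch of that order-based alternative, assuming the server guarantees the response array matches the request order (the function name is hypothetical):

```typescript
// Pair each query with its payload by array index. This only works
// when the server preserves request order in the batched response,
// so a length mismatch is treated as a protocol error.
function correlateByOrder<Q, P>(queries: Q[], payloads: P[]): Array<[Q, P]> {
  if (queries.length !== payloads.length) {
    throw new Error("Batched response length must match request length");
  }
  return queries.map((q, i) => [q, payloads[i]] as [Q, P]);
}
```

Compared with the id scheme, this trades a smaller wire format for the requirement that the transport deliver all payloads together and in order, which holds for a single HTTP response but not for a WebSocket stream.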

export function createNetworkInterface(
  uri: string,
  opts: RequestInit = {},
  transportBatching = false,
Contributor

I don't think just having this as a third argument is a good idea. I'd replace uri with NetworkInterfaceOptions or something.

So you can do createNetworkInterface('/graphql') or createNetworkInterface({ uri: '/graphql', serverBatching: true })

Contributor Author

Should opts be a part of NetworkInterfaceOptions or should it be a second argument? For example, would you do:

createNetworkInterface({
  uri: '/graphql',
  opts: { ... },
  serverBatching: true,
})

or

createNetworkInterface({
  uri: '/graphql',
  serverBatching: true,
},
{ ... })

The first option makes more sense to me but it probably breaks existing code that uses createNetworkInterface.

Contributor

We can do the first one in a backwards-compatible way by checking if the first argument is a string. Then for 1.0 we can remove the backcompat?
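That string check might look something like the following sketch; `NetworkInterfaceOptions`, its fields, and `normalizeArgs` are assumptions for illustration, not the actual implementation in this PR:

```typescript
// Hypothetical options shape for the object-style call.
interface NetworkInterfaceOptions {
  uri: string;
  opts?: object;               // fetch options passed through to the transport
  transportBatching?: boolean; // enable transport-level query batching
}

// Accept either the legacy (uri, opts) signature or a single options
// object, and normalize both into NetworkInterfaceOptions.
function normalizeArgs(
  uriOrOptions: string | NetworkInterfaceOptions,
  opts: object = {},
): NetworkInterfaceOptions {
  if (typeof uriOrOptions === "string") {
    // Legacy call: createNetworkInterface('/graphql', { ... })
    return { uri: uriOrOptions, opts, transportBatching: false };
  }
  // New call: createNetworkInterface({ uri: '/graphql', transportBatching: true })
  return { transportBatching: false, opts: {}, ...uriOrOptions };
}
```

The runtime `typeof` check keeps old call sites working until the backcompat path is removed in 1.0.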

Contributor Author

Sounds good to me.

return addQueryMerging(new HTTPFetchNetworkInterface(uri, opts)) as HTTPNetworkInterface;
const {
transportBatching = false,
opts = {},
Contributor

Can we change this to a more descriptive name, like fetchOptions, since before it was only internal?

@helfer
Contributor

helfer commented Aug 27, 2016

@Poincare @stubailo what's left to do here other than rebasing? If possible, I want to merge this next week.

@Poincare
Contributor Author

Poincare commented Aug 28, 2016

This PR works (tested on GitHunt), but if we merge the TypeScript compiler options PR first it will require a few changes, so it is waiting on that.

@helfer helfer self-assigned this Aug 30, 2016
@helfer helfer force-pushed the transport_batching branch from a967123 to 4418faa Compare August 30, 2016 23:02