[proposal] Add @concurrent directive for types #3203
Conversation
I reworded the PR description from your initial issue. Please confirm it matches your intention.
@StevenACoffman confirm
@StevenACoffman can you confirm that these changes are relevant for your library and are acceptable?
Thank you! I appreciate that you gave people the ability to opt in to the current behavior that they have been relying on. I've been a little busy at work lately, so I'm sorry I haven't been able to be as responsive as usual.
Are there any benchmarks or other data on the impact this change has on services under high load?
@krupyansky @StevenACoffman This merge broke dataloaders behind field resolvers (as in dataloadgen-example) for many users. It took me some time to find out that field resolvers are not executed concurrently anymore and that you have to set the newly introduced and undocumented `@concurrent` directive. So, from now on, if you want to use the dataloader pattern behind field resolvers, there are several important places to prepare (see the example below).
Honestly, this adds quite a lot of complexity. Field resolvers are meant for heavy work like fetching from a database, and dataloaders are a very common way to get rid of the N+1 problem. One missing `@concurrent` directive silently brings the N+1 problem back.
Here is an example with pagination and field resolvers using the dataloader pattern, which gets quite complex now:
directive @concurrent on OBJECT

type Query {
  getBooks(first: Int!, after: String): BookConnection
}

type BookConnection {
  totalCount: Int!
  # `@concurrent` on `BookEdge` makes iteration over this list concurrent; without it,
  # the dataloaders behind `Book.writer` and `Book.publisher` cannot work
  edges: [BookEdge!]!
  pageInfo: PageInfo!
}

# add @concurrent to make marshaling of `BookEdge` in `BookConnection.edges`
# concurrent, so that the dataloaders behind `Book.writer` and `Book.publisher`
# can work at all
type BookEdge @concurrent {
  cursor: String!
  node: Book!
}

# add @concurrent so that `writer` and `publisher` can be resolved in parallel
type Book @concurrent {
  id: ID!
  name: String!
  isbn: String!
  writer: Writer!
  publisher: Publisher!
}

type Writer {
  id: ID!
  name: String!
}

type Publisher {
  id: ID!
  name: String!
}
models:
  Book:
    fields:
      # make Book.writer a field resolver
      writer:
        resolver: true
      # make Book.publisher a field resolver
      publisher:
        resolver: true
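
To make it clearer why the dataloader pattern depends on concurrent field resolution, here is a self-contained toy sketch in Go (hand-rolled for illustration, not gqlgen or dataloadgen code): the loader gathers keys for a short window and issues one batched fetch, so batching only happens while the "resolvers" run concurrently. Run sequentially, every key becomes its own query and the N+1 problem is back.

// Toy illustration of dataloader batching, not library code.
package main

import (
	"fmt"
	"sync"
	"time"
)

type loader struct {
	mu      sync.Mutex
	pending []string
	done    chan struct{}
}

func newLoader() *loader {
	return &loader{done: make(chan struct{})}
}

// Load registers a key and blocks until the shared batch has been fetched.
func (l *loader) Load(key string) {
	l.mu.Lock()
	if len(l.pending) == 0 {
		// First key of a batch: run the fetch after a tiny collection window.
		time.AfterFunc(time.Millisecond, l.fetch)
	}
	l.pending = append(l.pending, key)
	done := l.done
	l.mu.Unlock()
	<-done
}

// fetch drains the collected keys and simulates one batched database query.
func (l *loader) fetch() {
	l.mu.Lock()
	keys := l.pending
	l.pending = nil
	done := l.done
	l.done = make(chan struct{})
	l.mu.Unlock()

	fmt.Printf("one query for keys: %v\n", keys)
	close(done)
}

func main() {
	l := newLoader()

	// Concurrent "field resolvers": all keys end up in a single batched query.
	var wg sync.WaitGroup
	for _, id := range []string{"w1", "w2", "w3"} {
		wg.Add(1)
		go func(id string) { defer wg.Done(); l.Load(id) }(id)
	}
	wg.Wait()

	// Sequential "field resolvers": each Load becomes its own query (N+1).
	for _, id := range []string{"w1", "w2", "w3"} {
		l.Load(id)
	}
}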
Thanks for the note, @razorness. I accidentally discovered this behavior today when debugging a failed test, which eventually led me here. I find the consequences for dataloaders pretty devastating. I think this should have been implemented in a non-breaking way, making the changed behavior opt-in rather than the previous behavior. I didn't find a ticket addressing the issue, so I created one: #3285. @krupyansky @StevenACoffman I'd appreciate your thoughts on the matter. Maybe we should move the discussion to the new ticket?
@StevenACoffman @razorness @Metamogul I made a fix. Please give me a review.
We use one goroutine for every element of a list when marshaling the GraphQL object. This performs badly on load tests. We should use goroutines only for network calls, certain other system calls, or expensive parallel calculations. Marshaling isn't an expensive calculation.
In this PR, I propose adding a `@concurrent` directive that controls this behavior. Lists whose element type has the `@concurrent` directive will get generated code as it is now, but lists whose element type does not have the directive will no longer get concurrent marshaling code.
Added pull request for issue #3202
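
As a rough illustration of the trade-off (a hand-written sketch, not gqlgen's actual generated code; `book`, `marshalBook`, and the `concurrent` flag are stand-ins), the current behavior spawns one goroutine per list element, while the proposed default marshals elements in a plain loop and only keeps the goroutines when the element type opts in via `@concurrent`:

// Sketch of per-element-goroutine vs. sequential list marshaling.
package main

import (
	"fmt"
	"strings"
	"sync"
)

type book struct{ name string }

// marshalBook stands in for the generated per-object marshal function.
func marshalBook(b book) string { return fmt.Sprintf("{%q}", b.name) }

// marshalBooks renders a list either with one goroutine per element (the
// current behavior, kept when the element type carries @concurrent) or with
// a simple loop (the proposed default for types without the directive).
func marshalBooks(books []book, concurrent bool) string {
	out := make([]string, len(books))
	if concurrent {
		var wg sync.WaitGroup
		for i := range books {
			wg.Add(1)
			go func(i int) {
				defer wg.Done()
				out[i] = marshalBook(books[i])
			}(i)
		}
		wg.Wait()
	} else {
		for i := range books {
			out[i] = marshalBook(books[i])
		}
	}
	return "[" + strings.Join(out, ",") + "]"
}

func main() {
	books := []book{{"A"}, {"B"}, {"C"}}
	fmt.Println(marshalBooks(books, true))  // element type has @concurrent
	fmt.Println(marshalBooks(books, false)) // element type without the directive
}

For cheap work like string building, the goroutine version mostly adds scheduling overhead, which is the performance concern motivating this proposal; the concurrent path still matters when field resolvers behind the list do real I/O.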