proto-loader: Add TypeScript generator #1474
Conversation
It would be very useful to have this covered by a baseline test (to have a generated …)
The new draft is out in version 0.6.0-pre2.
The last commit changes a few things:
The new version is now available in version 0.6.0-pre4 of the library.
I'm giving this a spin and noticed the following problems. Using this proto file:

// echo_service.proto
syntax = "proto3";
message EchoMessage {
string value = 1;
int32 value2 = 2;
}
service EchoService {
rpc Echo (EchoMessage) returns (EchoMessage);
rpc EchoClientStream (stream EchoMessage) returns (EchoMessage);
rpc EchoServerStream (EchoMessage) returns (stream EchoMessage);
rpc EchoBidiStream (stream EchoMessage) returns (stream EchoMessage);
}

The generated client interface seems OK, but there are a couple of issues with the server interface. Here's the generated code:

export namespace ServiceHandlers {
export namespace EchoMessage {
}
export interface EchoService {
Echo(call: grpc.ServerUnaryCall<messages.EchoMessage__Output>, callback: grpc.SendUnaryData<messages.EchoMessage>): void;
EchoClientStream(call: grpc.ServerReadableStream<messages.EchoMessage__Output>, callback: grpc.SendUnaryData<messages.EchoMessage>): void;
EchoServerStream(call: grpc.ServerWriteableStream<messages.EchoMessage__Output, messages.EchoMessage>): void;
EchoBidiStream(call: grpc.ServerDuplexStream<messages.EchoMessage__Output, messages.EchoMessage>): void;
}
}

Issues: …
Tested with:
Added a PR to fix these issues here: murgatroid99#1
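(The issues list above is truncated in this copy of the thread. Judging from the follow-up fix, "Fix some type issues in the server interface", the likely problem was the type parameters on the server call types. As a hedged sketch only, assuming call and callback types that carry both request and response type parameters as in current @grpc/grpc-js typings, a corrected handler interface might look roughly like this; the messages import path is hypothetical.)

import type * as grpc from '@grpc/grpc-js';
// Hypothetical path to the generated message types for echo_service.proto.
import type * as messages from './echo_message';

export interface EchoServiceHandlers {
  // Unary and client-streaming handlers receive the deserialized request (__Output)
  // and send their response through the callback.
  Echo(call: grpc.ServerUnaryCall<messages.EchoMessage__Output, messages.EchoMessage>, callback: grpc.sendUnaryData<messages.EchoMessage>): void;
  EchoClientStream(call: grpc.ServerReadableStream<messages.EchoMessage__Output, messages.EchoMessage>, callback: grpc.sendUnaryData<messages.EchoMessage>): void;
  // Server-streaming and bidi handlers write responses directly to the call stream.
  EchoServerStream(call: grpc.ServerWritableStream<messages.EchoMessage__Output, messages.EchoMessage>): void;
  EchoBidiStream(call: grpc.ServerDuplexStream<messages.EchoMessage__Output, messages.EchoMessage>): void;
}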
Fix some type issues in the server interface
…atroid99/grpc-node into proto-loader_type_generator
@badsyntax Thank you for the fixes. I published the changes in 0.6.0-pre6.
@alexander-fenster Do you have a suggestion of an existing …
@murgatroid99 We use https://github.com/googleapis/gapic-showcase/blob/master/schema/google/showcase/v1beta1/echo.proto for testing our client library generators. It covers pretty much all the use cases supported by Google APIs.
Giving this a go, pretty much exactly what I've been looking for. I'm running into a couple of issues. I've just been going off some documentation I found here https://github.com/grpc/proposal/pull/183/files/577fa399cb28dbde07b41896588f2e6998699a8f so please correct me if I'm wrong.
const pkg = grpc.loadPackageDefinition(definition) as ProtoGrpcType;
server.addService(pkg.user.UserService.service, handler);

Here is the handler that the above is complaining about, where I try to add it as a service implementation:

const handler: ServiceHandlers.user.UserService = {
async findById(call, callback) {
const user = await prisma.user.findOne({
where: {
id: call.request?.id,
},
});
callback(null, user);
},
};

Here is the proto file I've been using to experiment with:

syntax = "proto3";
package user;
service UserService {
rpc findById (FindByIdRequest) returns (User) {}
}
message FindByIdRequest {
string id = 1;
}
message User {
string id = 1;
string first_name = 2;
string last_name = 3;
string email = 4;
bool is_verified = 5;
}
I had something similar, but it re-exported everything from the top level file, and that added a lot of complexity so I removed it. Re-exporting just the request and response types along with the corresponding service is an interesting idea, but I am concerned about exporting the same type in multiple places with different names, and only doing that with some message types. In particular, that leads to a situation where you can reference the request or response type of a method using one naming convention (e.g. …).

My own experience with using the generated code has been that the VSCode automatic import generation removes a lot of the friction of the current setup. Of course, not everyone uses VSCode, but I expect that other IDEs have similar functionality. In addition, in some cases TypeScript's type inference can allow you to avoid explicitly referencing the types at all.
Totally understand. I was able to convince TypeScript to provide this information to me. In case anyone else is interested, this is how I extract the request/response types from a service client type:

type Callback<T> = (err?: grpc.ServiceError, res?: T) => void;
type ServiceClientIO<C> = C extends () => grpc.ClientDuplexStream<infer I, infer O>
? [I, O]
: C extends (args: infer I) => grpc.ClientReadableStream<infer O>
? [I, O]
: C extends (cb: Callback<infer O>) => grpc.ClientWritableStream<infer I>
? [I, O]
: C extends (args: infer I, cb: Callback<infer O>) => grpc.ClientUnaryCall
? [I, O]
: never;
type API = MyServiceAPIClient;
type Req = {[P in keyof API]: ServiceClientIO<API[P]>[0]};
type Res = {[P in keyof API]: ServiceClientIO<API[P]>[1]};

Then I can use it like so to get the request/response types:

type GetSomeMethodRequest = Req['GetSomeMethod'];
type GetSomeMethodResponse = Res['GetSomeMethod'];

The nice part about using the …
I think you can replace your …
It looks like that interface should work, but for some reason changing out my …
Hey @murgatroid99, do you need support on this PR? I'd be willing to do a code review. I'm very familiar with code generation and TypeScript, but not with protobufjs specifically. We're using gRPC for a new Node service and would much prefer using this over the grpc-tools protobuf lib.
@IanEdington I would appreciate it if you reviewed this code.
@murgatroid99 I found a potential issue while working on a service with multiple versions, or maybe I'm simply doing something wrong, but I looked at the source code and I have the impression that it might be a bug. Given these two proto files:

syntax = "proto3";
package test.name.v1alpha1;
message Request {}
message Response {}
service MyService { rpc GetResponse(Request) returns (Response); }

syntax = "proto3";
package test.name.v1alpha2;
message Request {}
message Response {}
service MyService { rpc GetResponse(Request) returns (Response); }

(Note the version change.) And the following command:

proto-loader-gen-types \
--bytes array \
--defaults false \
--grpcLib @grpc/grpc-js \
--includeComments \
--includeDirs ./protos \
--keepCase false \
--longs number \
--oneofs false \
--outDir ./src/protos \
test/name/v1alpha1/my_service.proto \
test/name/v1alpha2/my_service.proto

The root …
There's not …
@merlinnot The code generator outputs files into the output directory with a name based on the base filename of the input files. It is not set up to handle multiple input files with the same base filename, as you have there. I'm not sure what would be the right way to handle that, but you can easily work around the problem by running two separate generator commands with different output directories.
Hi @murgatroid99, our team has been using this MR/pre-release on services we are currently building. TL;DR: it's been great and I think this MR adds tremendous value. Here are a couple of feedback points and things to possibly consider (or ignore, that's fine too!):

Index / Barrels

Barrel/index files would be extremely useful to simplify type imports and avoid multiple import lines. What's the intent in generating a .ts file per message instead of a file per package?

… instead of …

We currently use barrelsby as a build step to make those index files, a process that can be clunky and error-prone, especially if we want to make a top-level index file, where you now expose yourself to name clashes between your own protos and vendored ones like the Google common APIs. Possibly this workaround is enough.

Helpful loader typing

The grpc-loader types themselves could potentially benefit from small utilities to perform the service typecast for the loaded package. We ended up doing this to load our service protos to make it more dev friendly: …

Access to the pbjs Root

Currently when using the grpc-loader, the loaded pbjs Root object is not easily retrievable. This could be useful in cases where you need the pbjs definitions for a message in the root, for things like …

*__Output types

It wasn't very clear to team members what the purpose/use of the __Output types was by default. I'm not certain …

EnumTypeDefinition included even when unused

In the cases where you compile your TypeScript with …

Thanks for the great PR! -b
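(The loader snippet referenced above did not survive in this copy of the thread. As an illustration of the kind of typecast helper being described, here is a minimal sketch only, assuming a generated ProtoGrpcType and the hypothetical file layout shown in the imports.)

import * as grpc from '@grpc/grpc-js';
import * as protoLoader from '@grpc/proto-loader';
// Hypothetical path to the generated root type for the loaded proto package.
import type { ProtoGrpcType } from './proto/user';

// Load a .proto file and cast the resulting package object to its generated type,
// giving typed access to the service constructors and message definitions.
function loadTypedPackage(protoPath: string): ProtoGrpcType {
  const definition = protoLoader.loadSync(protoPath, {
    keepCase: false,
    longs: String,
    defaults: true,
    oneofs: true,
  });
  return grpc.loadPackageDefinition(definition) as unknown as ProtoGrpcType;
}

// Usage (paths and service names are illustrative):
// const pkg = loadTypedPackage('protos/user.proto');
// const client = new pkg.user.UserService('localhost:50051', grpc.credentials.createInsecure());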
It seems like most of the files are named after a package name? All request/response/service interfaces seem to behave like this.
It seems to me that we should simply merge these objects instead of overwriting them. Since we already have objects structured after package names, this would work great:

export interface ProtoGrpcType {
test: {
name: {
v1alpha1: {
Request: MessageTypeDefinition
Response: MessageTypeDefinition
MyService: SubtypeConstructor<typeof grpc.Client, _test_name_v1alpha1_MyServiceClient> & { service: ServiceDefinition }
}
v1alpha2: {
Request: MessageTypeDefinition
Response: MessageTypeDefinition
MyService: SubtypeConstructor<typeof grpc.Client, _test_name_v1alpha2_MyServiceClient> & { service: ServiceDefinition }
}
}
}
}

Even the imports (…
That's what I'm currently doing; however, it leads to a very unnatural structure of the files (not similar to any other gRPC generator). To clarify further:
The proposal is to merge …

This issue is easily reproducible with many Google libraries, which use more than one version (e.g. …).
Hey, sorry for saying I would do something and then not doing it. We've moved away from using the protobufjs loader in favour of the grpc proto tools. Here's a partial review from when I was looking at it earlier. Also, based on this conversation it looks like some of these comments will be out of date.
"protobufjs": "^6.8.6" | ||
"long": "^4.0.0", | ||
"protobufjs": "^6.10.0", | ||
"yargs": "^16.1.1" |
Could any of these be * dependencies, peer dependencies, or optional dependencies?

The yargs usage in this repo seems like it could be satisfied by any version of yargs. It might be worth using * or >= a lower version.

protobufjs seems like it would make the most sense as a peer dependency, since you don't want 2 versions installed, only the version from the … (I know your change doesn't add this dep)
What would be the benefit of making these changes?
It would make it easier to manage dependencies for the users of this library.
If a user has version 15 of yargs defined in another package, "*" would use that version instead of forcing 2 versions of yargs to exist, potentially causing conflicts.
You almost definitely don't want two versions of protobufjs to exist in node_modules, since they would generate slightly different .js / .d.ts files. Using a peer dependency would force this dependency to use whatever package was already defined in the package. If you have a 'peer' dependency it doesn't automatically install when doing yarn install or the npm equivalent. Adding a protobufjs dev dependency alongside the peer will make the local development experience smoother.
In general it's nice to work with libraries that are permissive about the packages they depend on, unless they need something specific for a version.
The problem is that I know that the version of yargs I am currently using works as I am using it, but I can't guarantee that every possible version of yargs that might ever be published will work the same way. And the NPM package structure should prevent version conflicts, so I don't see that as a relevant concern.
As for protobufjs, a peer dependency doesn't make sense. The purpose of a peer dependency is for when the user is expected to use that library directly, and the library with a peer dependency is a plugin or similar. But in this case, the proto-loader library is an abstraction layer over protobufjs, so it needs to use it directly, and it expects specific features from specific versions of protobufjs. I also don't see how different versions of protobufjs generating different output is a point in favor of making this change. Keeping the dependency as it is ensures that a specific version of protobufjs will be used when doing this code generation. Any other use of protobufjs outside of the context of this library is completely irrelevant to the function of this library.
Co-authored-by: Ian Edington <IanEdington@gmail.com>
Index / Barrels

Many of the design choices in this PR, including that one, had the goal of minimizing the chances of any name conflicts, especially between files. If we wanted to add these files, we would need to choose a name that wouldn't conflict with anything else that is getting generated. For example, I think there are potential issues with packages that contain both concrete types and other packages. One other thing to consider is that types for messages or enums defined within other messages don't follow the same naming convention, but you can still access those types if you have to. In my own experience, automatic import generation in an IDE like VSCode makes the scattering of types into different files much less of an issue.

Helpful loader typing

A similar idea was suggested in a previous comment. I agree that this would be useful, but I would prefer not to add any more complexity to this PR, so this change would belong in a future PR.

Access to the pbjs Root

This is a strong non-goal of this library. The point is to provide an abstraction over Protobuf.js, so we do not want to expose any Protobuf.js APIs directly.

*__Output types

If you have a suggestion for a better naming convention or a good way to explain them, please share. That's what I came up with and I don't have any better ideas. Comments in the generated code itself would be tricky because they would conflict with the generated comments from the …

EnumTypeDefinition included even when unused

Doing it that way makes the code generation simpler, but I can change it if it's causing problems.
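(As a rough illustration of the intent behind the __Output naming discussed above, and not the exact generator output, which depends on the loader options: the plain message interface describes what you are allowed to pass when constructing a message to send, while the __Output interface describes what you receive after deserialization. Field optionality below is illustrative, using the EchoMessage proto from earlier in the thread.)

// Permissive "input" shape: fields may be omitted when constructing a message to send.
export interface EchoMessage {
  value?: string;
  value2?: number;
}

// "Output" shape: what a deserialized EchoMessage looks like on the receiving side,
// reflecting loader options such as defaults, longs, and enums.
export interface EchoMessage__Output {
  value: string;
  value2: number;
}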
@merlinnot I think you are probably right that merging those files is the right solution, but that change is a lot trickier than it appears. First, the current code essentially runs the entire code generation algorithm separately for each input file, so there would need to be changes at multiple levels to accommodate merging those files. And second, merging those objects is actually complex: those objects never actually exist in memory; the generator translates directly from Protobuf.js Namespace objects to generated code, and those Namespace objects wouldn't merge correctly.
Thanks for the explanation. It's a shame we can't implement it easily; I guess I'll have to live with workarounds for now.
…/grpc-node into proto-loader_type_generator
@merlinnot It turns out that merging those files was easier than I expected, with an approach that didn't occur to me before. I published a new draft version 0.6.0-pre18 with that change; please check it out.
So happy to hear that! I'll test it today, tomorrow at the latest.
Ok, it took way less time than I expected. I tested it with moderately complex services (multiple overlapping and non-overlapping services and RPCs) and it worked out of the box. Many thanks, my codebase looks much, much more tidy now!
I have published this PR in version 0.6.0 of the library.
This is an implementation of grpc/proposal#183. The implementation is quick and dirty, but I want to have something here so that I can use it.