L70: @grpc/proto-loader TypeScript Type Generator CLI Tool #183
Conversation
- `--oneofs`: Indicates that virtual "oneof" fields will be set to the present field's name in the output
- `--includeDirs=<directories...>`, `-I`: A list of directories to search for included `.proto` files
- `--outDir`, `-O`: The path in which to output the `.d.ts` file
If the output is a single `.d.ts` file, would it be better to have a regular `-o` option for an output file name, dumping it to stdout by default if the path is not given?
If it's not a single `.d.ts` but multiple `.d.ts` files (e.g. one per service), it's worth mentioning that too.
I've clarified what files will be generated and in what directory structure.
Will this support the […]? Other ts generators also have this problem. I've made an issue on ts-protoc-gen related to this as well.
This has no relationship to protoc. This is a generator for `@grpc/proto-loader`.
Ah! I found this via threads related to ts-protoc-gen about how it doesn't support @grpc/grpc-js, so I thought it was to support protoc-generated code. If this ts support is added, @grpc/proto-loader might be the solution my team needs. Thank you!
I've gone down this route a couple of times. I've been contemplating writing a custom plugin to do some of it. Some of the rough edges I've hit are around the context generation part. When code-generating protos for proto-loader, the provided context (i.e. `-I`) plays a semi-important role. I needed to overlay the TypeScript generation with some bash scripts to handle the overall context generation.
@mjpitz Can you elaborate on the context issues? Is that something that you think isn't well addressed in this proposal, or would benefit from more detail? I have a mostly complete implementation of this proposal at grpc/grpc-node#1474, and it seems to be working pretty well for my usage, which has multiple separate include directories.
@murgatroid99: Of course! So I think the proposal scopes this out by requiring clients to specify the options block. This might be mitigated by how clients end up using the tool to produce libraries. I know when I've done multi-language projects, managing contexts across each language can be tricky to do.

Based on a cursory view of this doc, it's not clear to me why we export 3 different types (grpctype, servicehandlers, messages). I would expect this to generate something a little closer to the way other languages generate messages and structures. Last time I went down this route, I was trying to base it on some of the manual ones I had written by hand. This was nice for many reasons as it produced similar call signatures to other languages.

I'm sure this proposal works, but it may end up being toilsome for library owners / consumers. For example, I need to unroll the packages multiple times (once for the messages, once for the handlers, and once again for registration). From a consumer's perspective, I would expect this to be somewhat captured by the import rather than duplicated in the package. For example, this feels really repetitive:

```ts
import { messages } from "./path/to/package/proto";

messages.path.to.package.proto.XXXX
```

vs.

```ts
import { MyMessage } from "./path/to/package/proto";
```

Again, I feel like a lot of this could be mitigated if owners wrap this in a client library, but it seems like extra work for something a lot of other languages get for free. Does this make sense?
```ts
import { ProtoGrpcType, ServiceHandlers, messages } from 'path/to/proto-file-name_proto'; // File generated by the tool is path/to/proto-file-name_proto.d.ts
import * as protoLoader from '@grpc/proto-loader';

const packageDefinition = protoLoader.loadSync('other/path/to/proto-file-name.proto', options);
```
It's quite common for folks to want to load more than one proto file. It would be good to switch this over to an array instead of a simple string to show how multiple types may be loaded.
This API does accept an array of strings as an alternative to a single string. When loading multiple files you can get the type of the output from `grpc.loadPackageDefinition` by importing the corresponding `ProtoGrpcType` interfaces from the generated files and combining them with `&`.

Also, maybe this isn't clear, but the proposed CLI tool can also accept multiple filenames to generate types for.
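For illustration, here is a minimal sketch of that approach. The file names, package contents, and the `FooProto`/`BarProto` aliases are hypothetical; the sketch only assumes that each generated `.d.ts` file exports a `ProtoGrpcType` interface as described in the proposal.

```ts
import * as grpc from '@grpc/grpc-js';
import * as protoLoader from '@grpc/proto-loader';
// Hypothetical generated files, one per input .proto file.
import type { ProtoGrpcType as FooProto } from './generated/foo';
import type { ProtoGrpcType as BarProto } from './generated/bar';

// loadSync accepts an array of file names as well as a single string.
const packageDefinition = protoLoader.loadSync(
  ['proto/foo.proto', 'proto/bar.proto'],
  { keepCase: true, defaults: true, oneofs: true }
);

// Intersecting the per-file ProtoGrpcType interfaces with `&` describes the
// object produced by loading both files together at runtime.
const proto = grpc.loadPackageDefinition(
  packageDefinition
) as unknown as FooProto & BarProto;
```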
The tool will be named `proto-loader-gen-types` and will have this usage:
Will this work as a protoc plugin? Or will it stand on its own?
This is a standalone tool, distributed in the `@grpc/proto-loader` package. It uses Protobuf.js rather than protoc for parsing.
First, I want to clarify that the runtime exports from the […]

The 4 exports in the example could probably be consolidated into 3, but fewer would be tricky because of name collisions. To be fair, the […] But the […]

As for the imports and namespaces, my proposal is a little different from what you describe there. With this proposal, you would be able to write

```ts
import { messages } from './generated/protoFilename';

messages.path.to.package.proto.XXXX
```

but you would actually also be able to write

```ts
import { MyMessage } from './generated/path/to/package/proto/MyMessage'
```
I made some changes to the proposal in response to the recent comments:
The gRPC implementation is specified at build time because the types depend on types from that library and the user should know at that point which implementation they are using. An alternative is to use generics to insert the types in a different way, but that increases the complexity of both the generated code and the code that uses it for relatively little gain.

Two types are generated for each message definition to represent that at runtime the library is permissive with what it accepts, but stricter with what it outputs. Having those stricter output types allows the user to get tighter type guarantees where possible.
This sentence is hard to read. Do you mean the following:
"Two types are generated for each message definition. To represent the message at runtime, the library (tool?) is .... "
What I'm trying to say is that the library is permissive about what types it accepts as input, but we can be more specific about what types it outputs and the two different types reflect that.
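As an aside on the build-time implementation choice mentioned in the excerpt above, here is a hedged sketch of why the generated types need to reference the chosen gRPC library directly. The `GreeterClient` interface and its method signature are invented for illustration and are not the proposal's actual generated output.

```ts
// The generated service and client types have to refer to classes and
// interfaces from a concrete gRPC implementation, e.g. @grpc/grpc-js here.
import type * as grpc from '@grpc/grpc-js';

// Hypothetical generated client interface: the signatures are built from
// types such as grpc.Client, grpc.ServiceError, and grpc.ClientUnaryCall,
// which differ between the `grpc` and `@grpc/grpc-js` packages.
export interface GreeterClient extends grpc.Client {
  sayHello(
    request: { name?: string },
    callback: (error: grpc.ServiceError | null, response: { message: string }) => void
  ): grpc.ClientUnaryCall;
}
```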
- `--verbose`, `-v`: Enable various logging output
- `--includeComments`: Include comments from the `.proto` files in the generated files

The output will be one file for each `message` and `enum` loaded, with file paths based on the package and type names, plus a master file per input file that combines all of those to produce the type that the user will load, as described above. `ProtoGrpcType` types from different files can be intersected to get the type that results from loading those files together at runtime. Messages will have an additional type generated, suffixed with `__Output`, that describes the type of objects that will be output by gRPC, i.e. response messages on the client, and request messages on the server. These "output" message types will be subtypes of the main message type, restricted based on the options that are set.
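As a rough sketch of what that layout could look like for a hypothetical message `MyMessage` in package `path.to.pkg` (all file and type names below are made up; the actual generated contents are defined by the proposal, not here):

```ts
// Assumed layout, following the description above:
//   generated/my_service.d.ts             -- master file per input .proto, exports ProtoGrpcType
//   generated/path/to/pkg/MyMessage.d.ts  -- one file per message/enum
//
// The per-message file is assumed to export both the permissive message type
// and the restricted __Output variant.
import type { MyMessage, MyMessage__Output } from './generated/path/to/pkg/MyMessage';

// Response messages on the client (and request messages on the server) use
// the __Output type, so the guarantees provided by the load options apply.
function handleResponse(response: MyMessage__Output): void {
  console.log(response);
}

// Messages constructed by the user as input use the permissive type,
// so fields may be omitted.
const request: MyMessage = {};
```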
Could you expand a bit on the need for stricter types, here or under the rationales section?
Also, I wonder if `_Strict` would work better. Conceivably, I may want to use the strict types for both input and output messages, e.g. to improve the overall type conformity of my codebase.
I have reworded the part of the rationales section about the stricter types.
LG overall. Thanks.
The goal of generating two separate interfaces for each message type is to describe two separate things as narrowly as possible: the objects that users can pass to the library as input, and the objects that the library will output. We have more control over what the library outputs, so we can be more specific in that type. This simplifies handling of messages output by the library, while still allowing the same flexibility when providing input messages that users get with the JavaScript interface.

For example, all Protobuf 3 fields are optional, so the input type allows the user to omit fields, but with the `defaults` code generation option, we know that the library will always output the default value for omitted fields, so the output type can guarantee that every field will have a value.
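To make that concrete, a hypothetical pair of generated interfaces might look like the following. The message and field names are invented; the shape only illustrates the optional-input / populated-output distinction when the `defaults` option is used.

```ts
// Input side: every proto3 field may be omitted when constructing a message.
export interface Greeting {
  text?: string;
  repeatCount?: number;
}

// Output side: with the `defaults` option, omitted fields come back populated
// with their default values, so every field can be marked as present.
export interface Greeting__Output {
  text: string;
  repeatCount: number;
}
```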
LG. Thanks for the update.
Discussion thread: https://groups.google.com/g/grpc-io/c/FwLprMC-5Qc