Loading deep proto files having "imports" #368
protobuf.js does not try to resolve files from multiple paths, mostly because it's also meant to run on the web, which would require unnecessary 404 error handling. You can, however, use the pbjs command line utility to generate a bundle of everything you need (always recommended for web deployments). It accepts, for example, additional include paths on the command line; run it without arguments to see the available options. |
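For instance (a sketch; the file names and the include path below are placeholders, and only the standard pbjs target/path/out options are used): bundle everything ahead of time with pbjs, then load the bundle at runtime with no filesystem resolution at all:

```js
// Build step (run once, shown here as a comment):
//   pbjs -t json -p protolib -o bundle.json example/helloworld.proto
// The -p option adds an include directory to search for imports.

// Runtime: load the pre-resolved bundle; no .proto parsing or imports happen here.
const protobuf = require("protobufjs");
const root = protobuf.Root.fromJSON(require("./bundle.json"));
console.log(root.toJSON());
```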
I initially found this issue when searching for this problem, but my actual solution came from a different issue (#70, and it IS in the documentation, actually). FYI, the same approach also works when using the grpc framework, which uses protobuf.js. |
Closing this for now. Feel free to send a pull request if this is still a requirement. |
I am trying to make a dynamic protobuf rpc server (not generated code or JSON) tool. I have gotten pretty far using grpc, and can parse this into a dynamic server:

```proto
syntax = "proto3";
package helloworld;

// import "google/api/annotations.proto";

// The greeting service definition.
service Greeter {
  // Sends a greeting
  rpc sayHello (GreeterRequest) returns (GreeterReply) {
    option (google.api.http) = {
      post: "/v1/hi"
      body: "*"
    };
  }
  // Sends a farewell
  rpc sayGoodbye (GreeterRequest) returns (GreeterReply) {
    option (google.api.http) = {
      post: "/v1/bye"
      body: "*"
    };
  }
}

// The request message containing the user's name.
message GreeterRequest {
  string name = 1;
}

// The response message containing the greetings
message GreeterReply {
  string message = 1;
}
```

I'd prefer not to have to comment out the google/api/annotations.proto import. I can't seem to point protobuf.js at my path:

```js
const path = require('path')
const protobuf = require('protobufjs')
protobuf.load({file: path.resolve('example/helloworld.proto'), root: path.resolve('protolib')})
.then(root => {
console.log(root)
})
.catch(e => {
console.error(e)
})
```

I get no error, but it's an empty Root:

```
Root {
  options: undefined,
  name: '',
  parent: null,
  resolved: false,
  comment: null,
  nested: undefined,
  _nestedArray: null,
  deferred: [],
  files: [] }
```

which is similar to what I get if I put totally wrong paths in. |
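One likely explanation for the empty Root (an assumption on my part, not something confirmed in the thread): protobuf.load takes a filename string, or an array of filenames, plus an optional Root instance, rather than an options object, so the {file, root} object above is never interpreted as a path. A minimal sketch of the documented two-argument form, reusing the paths from the snippet above:

```js
// Sketch: pass the .proto filename directly; the optional second argument
// is a protobuf.Root instance (not a base directory).
const path = require('path')
const protobuf = require('protobufjs')

const root = new protobuf.Root()
protobuf.load(path.resolve('example/helloworld.proto'), root)
  .then(r => console.log(r))
  .catch(e => console.error(e))
```

This still resolves imports relative to the loaded file; redirecting them to another directory is what the resolvePath override further down in the thread is for.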
If there were an option to just ignore imports (to avoid the error that the file is missing), that would work fine for me, too. I am currently using these proto files with that import commented out, and they're working fine. This code:

```js
const path = require('path')
const protobuf = require('protobufjs')
console.log(path.resolve('protolib'))
protobuf.load(path.resolve('example/helloworld.proto'))
.then(root => {
console.log(root)
})
.catch(e => {
console.error(e)
})
```

throws an error for the missing imported file.
|
There is a very basic mechanism in place to override the default behavior when importing files, like this:

```js
var root = new protobuf.Root();
root.resolvePath = function(origin, target) {
    // determine the path to load and return it (i.e. use a regex)
};
root.load(...);
```

You could also subclass protobuf.Root instead if that feels cleaner. For example, this is what pbjs does. |
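For illustration, here is a minimal sketch of the subclassing approach. The class name, the include directory, and the fallback logic are assumptions made up for this example, not part of protobuf.js:

```js
// Hypothetical sketch: a Root subclass that falls back to a fixed include
// directory when an import cannot be found next to the importing file.
const fs = require("fs");
const path = require("path");
const protobuf = require("protobufjs");

class IncludeRoot extends protobuf.Root {
  constructor(includeDir) {
    super();
    this.includeDir = includeDir; // assumed base directory for imports
  }
  resolvePath(origin, target) {
    // Default behavior first: resolve relative to the importing file.
    const normal = protobuf.util.path.resolve(origin, target);
    if (normal && fs.existsSync(normal))
      return normal;
    // Fallback: look the import up under the include directory.
    return path.join(this.includeDir, target);
  }
}

// Usage (paths are examples only):
// new IncludeRoot(path.resolve("protolib"))
//   .load("example/helloworld.proto")
//   .then(root => console.log(root));
```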
I like the idea of that even better. So I would just run pbjs on the google definitions, then run this? `protobuf.common("google", bigobject)` |
Most of the google definitions are already bundled; you can see that further down in the common.js file. Regarding pbjs: it already bundles all the google things within the generated .json files. You can then use the light build instead of the full build. |
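As a sketch of that workflow (the bundle file name is a placeholder): let pbjs emit a JSON descriptor, which already contains the referenced google types, and instantiate it with the light build, which drops the .proto parser that is no longer needed:

```js
// Sketch: consume a pbjs-generated JSON descriptor with the light build.
const protobuf = require("protobufjs/light");

// "./bundle.json" is a placeholder for the pbjs output file.
const root = protobuf.Root.fromJSON(require("./bundle.json"));
const Greeter = root.lookupService("helloworld.Greeter"); // example lookup
console.log(Greeter.methodsArray.map(m => m.name));
```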
I'm not sure I am following. I tried this:

```js
const path = require('path')
const protobuf = require('protobufjs')
protobuf.common('google/api/annotations.proto', {})
protobuf.load(path.resolve('test/test.proto'))
.then(root => {
console.log(root)
})
.catch(e => {
console.error(e)
})
```

I still get the missing-file error for google/api/annotations.proto. |
Actually, I got it: `protobuf.common(path.resolve('test/google/api/annotations.proto'), {})` will ignore it. |
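Putting the two snippets together: the stub apparently has to be registered under the exact resolved path that the loader will later ask for. A consolidated sketch of the combination reported to work above:

```js
// Sketch: register an empty stub for the google/api import under its
// resolved path, then load the proto that imports it.
const path = require('path')
const protobuf = require('protobufjs')

protobuf.common(path.resolve('test/google/api/annotations.proto'), {})
protobuf.load(path.resolve('test/test.proto'))
  .then(root => console.log(root))
  .catch(e => console.error(e))
```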
The commit above changed the behavior: returning null from resolvePath now skips the import entirely.

```js
var root = new protobuf.Root();
root.resolvePath = function(origin, target) {
    if (/\/google\/api\//.test(target))
        return null; // ignored
    return protobuf.util.path.resolve(origin, target); // default
};
root.load( .... )
```
|
As of today, this still does not work. Why?

demo.proto:

```proto
syntax = "proto3";
package demo;

import "google/protobuf/timestamp.proto";
import "google/api/annotations.proto";

// The greeting service definition.
service Greeter {
  // Sends a greeting
  rpc SayHello (HelloRequest) returns (HelloReply) {
    option (google.api.http) = {
      post: "/hello"
      body: "*"
    };
  }
}

// The request message containing the user's name.
message HelloRequest {
  string message = 1;
}

// The response message containing the greetings
message HelloReply {
  string message = 1;
  google.protobuf.Timestamp at = 2;
}
```

This code can be loaded normally:

```js
// demo.js
const protobuf = require("protobufjs")
const root = new protobuf.Root()
const result = root.loadSync('./demo.proto')
result.resolveAll()
console.log(result.toJSON())
```

But asynchronous loading will cause exceptions:

```js
// demo2.js
const protobuf = require("protobufjs")
const root = new protobuf.Root()
root.load('./demo.proto', {}).then(result => {
result.resolveAll()
console.log(result.toJSON())
})
```
|
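A sketch of applying the resolvePath workaround shown earlier in the thread to the asynchronous case, under the assumption that the failure comes from the google/api/annotations.proto import:

```js
// Sketch: skip the google/api import before loading asynchronously,
// mirroring the earlier resolvePath example.
const protobuf = require("protobufjs")

const root = new protobuf.Root()
root.resolvePath = function(origin, target) {
  if (target.indexOf("google/api/") !== -1)
    return null // skip this import
  return protobuf.util.path.resolve(origin, target) // default behavior
}
root.load('./demo.proto', {}).then(result => {
  result.resolveAll()
  console.log(result.toJSON())
})
```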
Hello,
Other protobuf software seems to have some concept of a base directory that protos are imported from, but I'm having trouble loading proto files in protobuf.js: whenever a file has an import, protobuf.js seems to expect the imported proto to be in the importing file's directory or a subdirectory of it.
Specifically, I've been trying to get a Node project up that can talk to Etcd's new v3 protobuf/grpc interfaces. I'm having trouble getting the .loadProto method to load this proto file and its imports without fairly brutal hackery.
The proto files seem to assume there's a base directory. rpc.proto lives at etcdserver/etcdserverpb/rpc.proto, yet it has these imports:

import "gogoproto/gogo.proto";
import "etcd/storage/storagepb/kv.proto";
So when I try to load the rpc.proto file, it joins the directory of rpc.proto with the specified import path, expecting for example an etcdserver/etcdserver/gogoproto/gogo.proto file, whereas the Etcd/Go protobuf tooling seems cognizant that gogoproto/ is a top-level directory of the project.
I've had to add a bunch of symlinks to my project, such that rpc.proto is symlinked into a "top-level directory", plus a symlink at etcd/storage/storagepb/gogoproto to satisfy etcd/storage/storagepb/kv.proto's duplicate ask for gogoproto/gogo.proto.

Ideally, I'd like to be able to call loadProto on each dependency in order, and I had a go at that which I thought ought to work. I still had to add the above-mentioned symlinks, and worse, it indicates that it's not me loading the imports ahead of time: loadProto is searching for imports on the filesystem rather than using what I provide to it ahead of time. So I'd have to add etcd/storage/storagepb/gogodeps/gogo.proto as another explicit, ahead-of-time import on line 14 in my code, where I enumerate deps.

Looking at the data structures protobuf.js generates, it seems to use full paths for dependencies. Other protobuf tooling seems to have some notion of a top-level directory that it can also use to search for files. That concept could be added, or, more trivially/hackily, protobuf.js could try recursing up the directory tree, looking for imports until satisfied.
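In the meantime, something like the following sketch can approximate a base-directory search using the resolvePath override shown elsewhere in this thread. The list of candidate root directories is hypothetical, made up for this example:

```js
// Sketch: resolve imports against a list of candidate root directories,
// falling back to protobuf.js's default resolution. Directory names are
// examples only and would need to match the actual checkout layout.
const fs = require("fs");
const path = require("path");
const protobuf = require("protobufjs");

const includeRoots = [".", "vendor"]; // hypothetical top-level directories

const root = new protobuf.Root();
root.resolvePath = function (origin, target) {
  for (const dir of includeRoots) {
    const candidate = path.resolve(dir, target);
    if (fs.existsSync(candidate))
      return candidate; // first existing match wins
  }
  return protobuf.util.path.resolve(origin, target); // default behavior
};

root.load("etcdserver/etcdserverpb/rpc.proto")
  .then(r => console.log(Object.keys(r.nested || {})))
  .catch(e => console.error(e));
```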
Please let me know if I can help clarify or inform further. I'd really like to get these protos loaded in protobuf.js and get on with my little app thing. Thanks for the amazing project, dcodeIO et al.; quite fantastic work here.