Protobufjs' codegen support #528
I'm confused. What does protobufjs have to do with this? Sending metadata is done per call, through the sendMetadata API function on the call object. It's not done through any of the generated protobuf code.
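For context, per-call metadata with the grpc package looks roughly like this - a minimal sketch, assuming client is a stub produced by grpc.loadPackageDefinition, and myMethod and the header name are hypothetical:

```ts
import * as grpc from 'grpc';

// Metadata is built and passed per call, as a plain argument;
// none of this goes through generated protobuf code.
const metadata = new grpc.Metadata();
metadata.add('authorization', 'Bearer my-token'); // hypothetical header

client.myMethod(request, metadata, (err: Error | null, response: unknown) => {
  if (err) throw err;
  console.log(response);
});
```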
I was digging deeper and it seems like there's something really weird. Maybe the premise of this issue is wrong. At the same time, protobuf.js provides code generation, including TypeScript definitions. Those definitions, however, are pretty much useless - well, at least as far as service definitions are concerned - because the actual calls I'm supposed to write in my code loaded via @grpc/proto-loader are completely different from what's in the generated TypeScript. I'm really confused and, firstly, I'd like someone with more knowledge (@nicolasnoble, can you please help with this?) to confirm whether this is the case, and, secondly, to explain what's going on if it's not.
Your rough interpretation is basically correct. While protobufjs CAN generate code for service definitions, it can NOT be used for gRPC, because it'd be lacking a lot of features. Instead we have the proto loader package, as you've found, which is one of the two options for using protobuf with grpc-node. It uses protobufjs behind the scenes to dynamically load the proto files, interpret them, and return service definitions that are suitable for gRPC to use. So, yes, long story short, you can't use grpc-node with protobufjs' service definitions directly. This needs to happen by way of the proto loader, which dynamically massages things into a proper grpc service. The other option is to use the protoc compiler to generate static service definitions that you can then load at runtime.
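A minimal sketch of the proto loader path described above (the .proto filename and the package/service names are hypothetical):

```ts
import * as protoLoader from '@grpc/proto-loader';
import * as grpc from 'grpc';

// proto-loader uses protobufjs behind the scenes to parse the file,
// then massages the result into definitions grpc can consume.
const packageDefinition = protoLoader.loadSync('service_a.proto', {
  keepCase: true,
  longs: String,
  enums: String,
  defaults: true,
  oneofs: true,
});
const pkg: any = grpc.loadPackageDefinition(packageDefinition);

// pkg.mypackage.ServiceA is now a client constructor:
const client = new pkg.mypackage.ServiceA(
  'localhost:50051',
  grpc.credentials.createInsecure(),
);
```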
Also, please consider reviewing our examples at https://github.com/grpc/grpc/tree/master/examples/node
Thanks. I think I'll need a way to use the TypeScript message definitions generated by protobuf.js's pbts. I think the only way is to create a third-party generator (since protobuf.js would probably not want to adopt it in their codebase) utilizing pbjs or pbts from protobuf.js. Would you be interested in having that as part of node-grpc?

UPD: corrected the first part to make it more clear.
So, protobufjs' author (@dcodeIO) has been fairly keen in the past on adapting their project for grpc. Having grpc service definition support in the pbts tool might be something they'd consider adopting, or even merging, if you're doing the work for them. On the other hand, the reason we've been hiding protobufjs away behind the proto loader, in a way that prevents you from accessing the generated structures directly, is so we have more flexibility with our API. When protobufjs changes its API, we don't need to update in lockstep; we can stay behind for a while thanks to the "hiding away" part, quietly changing the glue code so as to avoid changing what we expose. But directly exposing a third-party project's API is risky, because you then need to follow their semantic versioning and release schedule.
One other option would be for pbts to add a more generic plugin interface for generating additional output. Then we could distribute a plugin for generating gRPC server interfaces in the proto loader library. This is what we currently do with the protoc code generator in the grpc-tools package. The downside is that designing and implementing a plugin interface like that would be a lot of development work, and we would still have to write the plugin when it's done.
@nicolasnoble I'm not sure what kind of API we're talking about here. If protobuf.js changes the way it serializes objects into messages, node-grpc users are affected too. So, if that's what you're worried about - there's already an implicit dependency of that kind. What I'd like to have is a codegen tool that could generate just the service definitions describing the structure of how node-grpc does the calls (I mentioned it should rely on pbts, but actually that's not strictly necessary). That would mean we'd generate message definition code using pbts, and service definition code using this new tool. If we include it in the node-grpc repo, then every project would be responsible for generating the typings for the API that it's in control of. I think that's a perfect separation of concerns for this case. Instead of relying on pbts, we could just implement a generator based on the data the proto-loader has access to. It should be enough, since pbts relies on pretty much the same data (it takes the output of pbjs as an input).
It doesn't really make sense to depend on pbts like that, because nothing that pbts outputs is part of the type definition of a gRPC service. This probably should have been mentioned earlier, but the gRPC service definitions generated by the proto loader library do not expose Protobuf.js message classes directly. Instead, they use plain JavaScript objects that are structurally similar to the corresponding message classes, with variations that depend on the options passed to the proto loader library at load time. So in reality, generating full type definitions for gRPC services would involve creating both a new message type generator and a new service type generator.
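To illustrate how load-time options change the object shape, a sketch (assuming a message with an int64 field named id; the filename is hypothetical):

```ts
import * as protoLoader from '@grpc/proto-loader';

// With { longs: String }, an int64 field surfaces as a decimal string:
//   { id: "9007199254740993" }
const asStrings = protoLoader.loadSync('msgs.proto', { longs: String });

// With { longs: Number }, the same field surfaces as a JS number,
// lossy beyond 2^53:
//   { id: 9007199254740993 }
const asNumbers = protoLoader.loadSync('msgs.proto', { longs: Number });
```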
And about the transitive API part: it actually happened in the past already, and this is the reason we had to deprecate grpc.load, which still uses protobufjs 5, since migrating grpc.load to protobufjs 6 would mean a major semantic versioning change for the grpc-node package. Here it means only a major semantic versioning change for the proto-loader package, and only if we actually can't absorb the change in the package's code. So we're very wary of our transitive dependencies now because of this. As it relates to what you're suggesting: if there is a change in protobufjs that ripples through the hypothetical package we have in our repo, but that uses pbts for messages, it still means we need to update said code so that it works again with the newer version of pbts. Which is why, as @murgatroid99 is saying, we're massaging the definitions to hide all of that away, so a change in another package doesn't necessarily mean we have to rush to fix our code. Transitive dependencies are to be dealt with very carefully.
If you use
I don't think you're understanding me. You're right that we're using Protobuf.js for serialization, so the protobuf.js API functionally determines what types the gRPC API uses, but they are still different types from the ones that pbts generates, and the transformation between them is not trivial.
@murgatroid99 are you talking about the types pbts generates for messages? If you're talking about the services - then that's true, they have this rpcImpl stuff that has nothing to do with node-grpc. That's why I'm saying we need a custom tool to generate the definitions there. If I'm missing something here, please correct me.

I understand that classes generated by pbts are not usable to be passed to node-grpc. But interfaces are. Are you saying that even the interfaces that pbts generates are not usable to represent arguments that I'd pass to a node-grpc client call?

I believe that for a service like this:

```proto
service ServiceA {
  rpc callFoo (RequestType) returns (ResponseType) {}
}
```

what we'd need to generate would be something like this:

```ts
import protobufjsGenerated from "bundle";

class ServiceA {
  callFoo(request: protobufjsGenerated.IRequestType, metadata: object, options: any, callback: Callback<protobufjsGenerated.IResponseType>): CallEmitter<protobufjsGenerated.IRequestType, protobufjsGenerated.IResponseType>
}
```

Don't mind the made-up names like Callback and CallEmitter here.
@nicolasnoble regarding transitive dependencies - agreed, they truly are dangerous.
A better example of what I'd like to see generated:

```ts
import * as pb from "./rpc"
import * as grpc from "node-grpc"

export interface IMyService {
  myMethod(argument: pb.IMyRequest, metadata: grpc.Metadata, options: grpc.CallOpts, callback: grpc.Callback<pb.IMyResponse>): grpc.ClientUnaryCall<pb.IMyRequest, pb.IMyResponse>;
}
```
Those message interfaces are more similar to our object types than I had previously realized, but I don't think they are close enough. I believe that those types correspond to what Protobuf.js calls a "valid message", but we use the toObject and fromObject transformations with configurable options, so the resulting objects can differ from that shape. The actual transformation code in the proto-loader library can be found here.
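The transformation is roughly the following - a simplified sketch of what proto-loader does, not its literal code:

```ts
import * as protobuf from 'protobufjs';

// Incoming bytes: decode to a protobufjs message, then convert it to a
// plain object using the conversion options supplied at load time.
function createDeserializer(cls: protobuf.Type, options: protobuf.IConversionOptions) {
  return (buffer: Buffer) => cls.toObject(cls.decode(buffer), options);
}

// Outgoing values: accept a plain object, convert it to a message with
// fromObject, then encode it back to bytes.
function createSerializer(cls: protobuf.Type) {
  return (value: object) => Buffer.from(cls.encode(cls.fromObject(value)).finish());
}
```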
@murgatroid99 exactly. If we look deeper, the classes proto-loader uses are in fact defined as follows (a concrete example): https://github.com/dcodeIO/protobuf.js/blob/69623a91c1e4a99d5210b5295a9e5b39d9517554/tests/data/rpc.d.ts#L27-L28 (I highlighted the interesting lines). Now I see - this is where I failed to notice that they don't take those generated interfaces. Actually, I now realize I know of a case where the interface generated by pbts does not correctly correspond to the input accepted by a call: when passing an enum value as a TypeScript enum member (effectively a number in the compiled JavaScript), protobuf.js will expect a string, and the code will bug out. Yeah, that's truly not trivial.
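A concrete illustration of that enum pitfall (hypothetical Color enum; the exact behavior depends on the conversion options in use):

```ts
// pbts types the field with the generated TS enum, whose members are
// numbers in the compiled JavaScript:
enum Color { RED = 0, GREEN = 1 }
const fromPbtsTypes = { color: Color.RED };   // -> { color: 0 }

// But a channel configured to represent enums as strings (for example,
// proto-loader's { enums: String }) expects and produces the name:
const whatRuntimeWants = { color: 'RED' };
```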
That's the thing: the user doesn't own the protobufjs dependency, so we're even safer here. As of now, there can't be any user breakage due to protobufjs, because the only front-end package they see and use is proto-loader, and that one depends on a very specific version of protobufjs. There's no "version of protobufjs" for the user to update here, and that's the level of segregation we're currently happy with.
@nicolasnoble oh, I see. A user will want to own it if they're doing code generation, though. It's like one of the first rules: take control of the version of the codegen tool. We have our protobuf.js version locked, for example; otherwise we'd get unwanted changes in the generated code.

To summarize the outcome of the overall discussion: it seems we need better support from the protobuf.js side before we can continue with this. I'm still using pbts definitions for typing unary calls, with a dirty hack that replaces rpcImpl with calls to node-grpc generated client stubs - it works well in 95% of the cases (that covers just unary calls; streaming calls are used directly from the node-grpc stubs and are untyped, but we only have about 10 out of ~150 calls that are non-unary).
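For the curious, such a hack might look roughly like this - a sketch only, where MyService is a hypothetical pbjs/pbts-generated service class and the method path is an assumption:

```ts
import * as grpc from 'grpc';
import { MyService } from './bundle'; // hypothetical pbjs-generated code

const raw = new grpc.Client('localhost:50051', grpc.credentials.createInsecure());

// rpcImpl shuttles already-serialized bytes through a node-grpc client,
// so grpc handles the wire and protobuf.js handles (de)serialization.
const rpcImpl = (
  method: { name: string },
  requestData: Uint8Array,
  callback: (err: Error | null, responseData?: Uint8Array) => void,
) => {
  raw.makeUnaryRequest(
    `/mypackage.MyService/${method.name}`, // hypothetical full method path
    (buf: Buffer) => buf,                  // bytes pass through untouched
    (buf: Buffer) => buf,
    Buffer.from(requestData),
    new grpc.Metadata(),
    {},
    callback,
  );
};

const service = MyService.create(rpcImpl, false, false);
```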
Actually, I have a mirroring issue for this at protobuf.js since April! protobufjs/protobuf.js#1017
Note that if your goal really is to have TypeScript, one other possibility would be to contribute to the protobuf project to add TypeScript support there :-)
The question now is how we want it to look. Additional work on the protobuf.js side is needed, that's for sure. But I wouldn't say I'd like code generation for the whole thing to live in one place; I'd prefer some kind of type-layer API established between the two projects, rather than one of them completely taking over the generation for both. I'm currently considering some kind of metaprogramming approach:

```ts
type UnaryCallDefinition<TReq, TRes> = { req: TReq, res: TRes }

export interface IMyServiceDescriptor {
  callFoo: UnaryCallDefinition<IMyRequest, IMyResponse>;
  callBar: UnaryCallDefinition<IMyAnotherRequest, IMyAnotherResponse>;
}
```

And on the node-grpc side, proto-loader should make use of those definitions so that the resulting client stub is properly typed - without codegen, all at runtime, with just the code generated on the protobuf.js side. I don't have an example for that yet, but basically the resulting client interface would look like:

```ts
interface IMyService {
  callFoo(argument: IMyRequest, metadata: grpc.Metadata, options: grpc.CallOpts, callback: grpc.Callback<IMyResponse>): grpc.ClientUnaryCall<IMyRequest, IMyResponse>;
}
```

Again - it should be possible to derive that at runtime from the "adapter" types generated on the protobuf.js side, without additional code generation. Do you guys like this idea?
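One hedged sketch of how the node-grpc side could consume such descriptors purely at the type level, reusing the hypothetical grpc.CallOpts/grpc.Callback/grpc.ClientUnaryCall names from the snippet above (requires TypeScript conditional types):

```ts
type TypedClient<TDesc> = {
  // Map every UnaryCallDefinition entry of the descriptor to a properly
  // typed client method; anything else is rejected.
  [K in keyof TDesc]: TDesc[K] extends UnaryCallDefinition<infer TReq, infer TRes>
    ? (argument: TReq, metadata: grpc.Metadata, options: grpc.CallOpts,
       callback: grpc.Callback<TRes>) => grpc.ClientUnaryCall<TReq, TRes>
    : never;
};

// TypedClient<IMyServiceDescriptor> then has the IMyService shape above.
```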
@dcodeIO, please join us if you're interested.
Generate the interface corresponding to the proto file in TypeScript.

Additional context / reference: https://docs.nestjs.com/microservices/grpc
I think I'm having the same issue here. I have a service which I'm passing metadata to, and getting metadata back. I've found this library the best for generating a d.ts file for my other services; however, it only generates service definitions with a request and callback, and not the optional metadata parameter.
Any update here? Any reason we can't get a TypeScript interface generator as a start, at least? Any other projects out there that have done this?
The main reason would be time and resources, really. It's on our radar, however.
@woodcockjosh It doesn't use protobufjs, but I've used https://github.com/agreatfool/grpc_tools_node_protoc_ts for a while and it's good. It generates TypeScript definition files corresponding to the js files protoc generates.
I noticed protobufjs gained some instructions on how to use its statically generated code with grpc: protobufjs/protobuf.js@9450f4d - do you think that will conclude this discussion? I'm experimenting with it, though I'm getting some type errors, like serialize/deserialize methods having invalid types, and I'm not sure if bi-di streams are supported / possible to run.
@kskalski I think it could conclude this discussion. We are using what is in the doc in production and it works quite well. If you are getting errors, it may be that you are on an older version of node.
Right, I suppose. My attempt is something like:

but I got an exception like this:

I wonder if anybody else has explored the bi-di surface area in this direction already.
I have an update on this: #1474 has a work-in-progress TypeScript generator for proto-loader.
Hi @murgatroid99, the update looks promising. I've given it a try and found that the generated ts file contains a bunch of namespaces. It seems TypeScript suggests avoiding the use of namespaces.
This article and this answer also prefer modules over namespaces. Any thoughts?
Also, I'm using #1380 to generate js files for grpc-js, like this:

which generates, in js:

```js
exports.FooClient = grpc.makeGenericClientConstructor(FooService);
```

and in ts:

```ts
export namespace ClientInterfaces {
  export interface FooClient extends grpc.Client {
    ...
  }
}
```

This is confusing and causing problems. E.g., with js I can do

```js
let rpcClient = new FooClient('localhost:50051', grpc.credentials.createInsecure())
```

but with ts I can't. Am I using it in a wrong way?
The types generated by the proto-loader type generator are for use with the objects generated by proto-loader. They have no relation to the code generated by that other tool. See grpc/proposal#183 for the design of the proto-loader type generator, including expected usage.
Regarding the namespace point, I believe that recommendation applies less to generated code, because we have less control over the names of individual objects. Using namespaces allows the exported objects to be in a tree structure that mimics the original package structure from the .proto files.
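For instance, a package foo.bar in the .proto files maps onto nested namespaces like this (a sketch; Baz is a hypothetical message):

```ts
// Same-named messages in different packages cannot collide this way.
export namespace foo {
  export namespace bar {
    export interface Baz {
      name?: string;
    }
  }
}
// Consumers refer to foo.bar.Baz, mirroring the protobuf package path.
```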
I believe this issue has been resolved with the introduction of the TypeScript code generator in @grpc/proto-loader.
Scratch the original issue, this is now about improving TypeScript definitions in general.
Skip ahead to #528 (comment)
Original issue text below.
Is your feature request related to a problem? Please describe.
I can't currently pass metadata (or any per-call option) when using protobuf.js.
Describe the solution you'd like
Add a way to pass metadata at the very least, and potentially any kind of per-call option when using protobuf.js.
This seems to be an integration issue between the two projects, and I don't know where to create an issue for it. I chose this project because it owns the concept of metadata.
Technically, I'm proposing, in TypeScript/Flow terms, to change the API and make the client generic to allow customization of the "metadata/arbitrary options" type, and add an argument to every call:
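Something along these lines - an illustrative sketch of the proposed shape, with all names hypothetical:

```ts
import { Metadata } from 'grpc';

// The per-call options type is a generic parameter T, threaded through
// rpcImpl as a new argument:
type RPCImpl<T> = (
  method: { name: string },
  requestData: Uint8Array,
  opts: T,
  callback: (err: Error | null, responseData?: Uint8Array) => void,
) => void;

// On the node-grpc end, T could start out as:
type CallOpts = { headers: Metadata };
```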
The rpcImpl should have the opts argument passed along with everything else. For the purposes of passing headers, we can start with supporting T with a type like { headers: Metadata } on the node-grpc end, and injecting headers (if any specified) into unary/streaming calls.

Describe alternatives you've considered
I considered switching from node.js to some wasm-based solution, like using the Go/Rust gRPC libraries. That's rather complicated, and adding support for metadata is a general improvement for everyone, so I decided to create this issue first and see how it goes.
Another solution we considered is just moving away from protobuf.js to google-protobuf. That one is more difficult to use, as it operates with custom types for messages instead of raw JavaScript objects. Also, it does not generate TypeScript definitions.
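The difference in ergonomics, sketched (MyRequest is a hypothetical protoc/google-protobuf generated class):

```ts
// google-protobuf wraps every message in a class with getters/setters:
const req = new MyRequest();
req.setName('example');
const bytes = req.serializeBinary();

// protobuf.js / proto-loader style: plain JavaScript objects.
const reqPlain = { name: 'example' };
```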
Given the two options I've presented above, I think it's important to do some work on the particular problem of poor protobuf.js/node-grpc compatibility, as it solves a huge pain point (for us, at least).
Additional context
We need the ability to pass metadata (headers only so far, no trailers) with our calls. We're using TypeScript-generated definitions with protobuf.js, as type definitions are mandatory in our project.
Metadata can only be derived from the context at the moment we perform the call. Creating a new client for every call would not be good in terms of performance.
We run the node-grpc native client; the servers are not implemented in JavaScript.