
timostamm / protobuf-ts


Protobuf and RPC for TypeScript

License: Apache License 2.0

Makefile 0.86% TypeScript 97.04% JavaScript 0.95% HTML 0.06% CSS 0.09% C# 0.94% Go 0.04% Shell 0.03%
twirp grpc-web bigint code-size reflection typescript protobuf custom-options well-known-types strict-conformance

protobuf-ts's Introduction

protobuf-ts

Protocol buffers and RPC for Node.js and the Web Browser. Pure TypeScript.

For the following .proto file:

syntax = "proto3";

message Person {
    string name = 1;
    uint64 id = 2;
    int32 years = 3;
    optional bytes data = 5;
}

protobuf-ts generates code that can be used like this:

let pete: Person = {
    name: "pete", 
    id: 123n, // it's a bigint
    years: 30
    // data: new Uint8Array([0xDE, 0xAD, 0xBE, 0xEF])
};

let bytes = Person.toBinary(pete);
pete = Person.fromBinary(bytes);

pete = Person.fromJsonString('{"name":"pete", "id":"123", "years": 30}');

What are protocol buffers?

Protocol buffers is an interface definition language and binary serialization format.
Data structures defined in .proto files are platform-independent and can be used in many languages.
To learn more about the capabilities, please check the official language guide.

Quickstart

  • npm install @protobuf-ts/plugin

    installs the plugin and the compiler "protoc"

  • download the example file msg-readme.proto and place it into a protos/ directory

  • npx protoc --ts_out . --proto_path protos protos/msg-readme.proto

    generates msg-readme.ts
    if your protoc version asks for it, add the flag "--experimental_allow_proto3_optional"

Features

Read the MANUAL to learn more.

Copyright

Support

Buy Me A Coffee

protobuf-ts's People

Contributors

be9, benmcmorran, colinlaws, commanderroot, daboxu, deini, dependabot-preview[bot], dependabot[bot], dimo414, erichiggins0, eyalpost, gtb3nw, hugebdu, jamesbirtles, jcready, kskalski, lesomnus, mikeylemmon, mikonse, odashevskii-plaid, pedelman, qingfenglee, sessfeld, smaye81, tannera, timostamm, tpcstld, vicb, whs, wielski



protobuf-ts's Issues

prefer system installed protoc

We have a monorepo and use protoc for more than our web client, which is why it's installed and configured system-wide in our development container. As such, the protoc installed into our node_modules/.bin folder by @protobuf-ts/plugin gets priority in our build scripts and is missing a bunch of includes we rely on. Is there a way to disable installing this dependency, and/or a config option to prefer the system-wide installed version?

(Our "build" scripts are defined in our package.json.)
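A possible workaround until then (untested here; `--plugin` is a standard protoc flag, and `protoc-gen-ts` is the plugin binary that `@protobuf-ts/plugin` installs into node_modules/.bin):

```shell
# Hypothetical workaround: bypass the bundled protoc by invoking the system
# binary directly and pointing it at the protobuf-ts plugin explicitly.
# All paths here are illustrative.
/usr/bin/protoc \
  --plugin=protoc-gen-ts=./node_modules/.bin/protoc-gen-ts \
  --ts_out . \
  --proto_path protos \
  protos/msg-readme.proto
```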

[BUG] The `generate_dependencies` parameter doesn't work

When debugging the issue I was able to narrow it down to this method:

isTypeUsed(type: AnyTypeDescriptorProto, inFiles: FileDescriptorProto[]): boolean {
    let used = false;
    for (let fd of inFiles) {
        this.tree.visitTypes(fd, typeDescriptor => {
            if (used) return;
            if (DescriptorProto.is(typeDescriptor)) {
                const usedInField = typeDescriptor.field.includes(type);
                if (usedInField) {
                    used = true;
                }
            } else if (ServiceDescriptorProto.is(typeDescriptor)) {
                const usedInMethodInput = typeDescriptor.method.some(md => this.nameLookup.resolveTypeName(md.inputType!) === type);
                const usedInMethodOutput = typeDescriptor.method.some(md => this.nameLookup.resolveTypeName(md.outputType!) === type);
                if (usedInMethodInput || usedInMethodOutput) {
                    return true;
                }
            }
        })
    }
    return used;
}

I was able to resolve the issue by changing the method to do the following:

 isTypeUsed(type: AnyTypeDescriptorProto, inFiles: FileDescriptorProto[]): boolean {
     // Probably not the right way to get this
     const normalizedTypeName = this.nameLookup._reverse.get(type);
     let used = false;
     for (let fd of inFiles) {
         if (used) return used;
         this.tree.visitTypes(fd, typeDescriptor => { 
             if (used) return;
             if (DescriptorProto.is(typeDescriptor)) {
                 const usedInField = typeDescriptor.field
                     .map(f => f.typeName)
                     .filter(Boolean)
                     .map(this.nameLookup.normalizeTypeName)
                     .includes(normalizedTypeName);
                 if (usedInField) {
                     used = true;
                 }
             } else if (ServiceDescriptorProto.is(typeDescriptor)) { 
                 const usedInMethodInput = typeDescriptor.method.some(md => this.nameLookup.resolveTypeName(md.inputType!) === type); 
                 const usedInMethodOutput = typeDescriptor.method.some(md => this.nameLookup.resolveTypeName(md.outputType!) === type); 
                 if (usedInMethodInput || usedInMethodOutput) { 
                     used = true; 
                 } 
             } 
         }) 
     } 
     return used; 
 } 

I'd be happy to open a PR if you think this is the correct way to resolve the issue.

Ideas

This is a collection of ideas for further development:

Test error in test-generated

Timo,

We have discussed that earlier if I remember correctly.

I am creating an issue because I always get an error when running make in the root directory of the master branch (Linux, Node 14.4) without any local changes.

The error is:

@protobuf-ts/test-generated: make clean generate-speed-bigint test-spec-bigint test-conformance-bigint
@protobuf-ts/test-generated: 'clean' done
@protobuf-ts/test-generated: 'generate-speed-bigint' done
@protobuf-ts/test-generated: Jasmine started
@protobuf-ts/test-generated:   generated client style promise
@protobuf-ts/test-generated:     duplex-streaming
@protobuf-ts/test-generated:       ✗ duplex-streaming
@protobuf-ts/test-generated:         - Unhandled promise rejection: RpcError: user cancel
@protobuf-ts/test-generated:         Code: CANCELLED
@protobuf-ts/test-generated:         /Users/berchet/Code/protobuf-ts/packages/test-generated/ts-out/service-style-promise.ts:134:27
@protobuf-ts/test-generated:                 if (opt.abort) {
@protobuf-ts/test-generated:                     opt.abort.addEventListener("abort", () => {
@protobuf-ts/test-generated:                         const e = new RpcError("user cancel", "CANCELLED");
@protobuf-ts/test-generated:                                   ~
@protobuf-ts/test-generated:                         outputs.notifyError(e);
@protobuf-ts/test-generated:                         if (inputs.throw)
@protobuf-ts/test-generated:         /Users/berchet/Code/protobuf-ts/packages/test-generated/node_modules/event-target-shim/src/event-target.mjs:337:35
@protobuf-ts/test-generated:         jasmine-spec-reporter: unable to open '/Users/berchet/Code/protobuf-ts/packages/test-generated/node_modules/event-target-shim/src/event-target.mjs'
@protobuf-ts/test-generated:         Error: ENOENT: no such file or directory, open '/Users/berchet/Code/protobuf-ts/packages/test-generated/node_modules/event-target-shim/src/event-target.mjs'
@protobuf-ts/test-generated:         /Users/berchet/Code/protobuf-ts/packages/test-generated/node_modules/abort-controller/src/abort-signal.ts:68:12
@protobuf-ts/test-generated:         jasmine-spec-reporter: unable to open '/Users/berchet/Code/protobuf-ts/packages/test-generated/node_modules/abort-controller/src/abort-signal.ts'
@protobuf-ts/test-generated:         Error: ENOENT: no such file or directory, open '/Users/berchet/Code/protobuf-ts/packages/test-generated/node_modules/abort-controller/src/abort-signal.ts'
@protobuf-ts/test-generated:         /Users/berchet/Code/protobuf-ts/packages/test-generated/node_modules/abort-controller/src/abort-controller.ts:26:9
@protobuf-ts/test-generated:         jasmine-spec-reporter: unable to open '/Users/berchet/Code/protobuf-ts/packages/test-generated/node_modules/abort-controller/src/abort-controller.ts'
@protobuf-ts/test-generated:         Error: ENOENT: no such file or directory, open '/Users/berchet/Code/protobuf-ts/packages/test-generated/node_modules/abort-controller/src/abort-controller.ts'
@protobuf-ts/test-generated:         /Users/berchet/Code/protobuf-ts/packages/test-generated/spec/generated-client-style-promise.spec.ts:295:23
@protobuf-ts/test-generated:                     const abort = new AbortController();
@protobuf-ts/test-generated:                     setTimeout(() => {
@protobuf-ts/test-generated:                         abort.abort();
@protobuf-ts/test-generated:                               ~
@protobuf-ts/test-generated:                     }, 5)
@protobuf-ts/test-generated:                     let inputError: any = undefined;
@protobuf-ts/test-generated:         internal/timers.js:549:17
@protobuf-ts/test-generated:         jasmine-spec-reporter: unable to open 'internal/timers.js'
@protobuf-ts/test-generated:         Error: ENOENT: no such file or directory, open 'internal/timers.js'
@protobuf-ts/test-generated:         internal/timers.js:492:7
@protobuf-ts/test-generated:         jasmine-spec-reporter: unable to open 'internal/timers.js'
@protobuf-ts/test-generated:         Error: ENOENT: no such file or directory, open 'internal/timers.js'

It is not clear to me what the root cause of the error is.
I can see that this error is not triggered on the CI.

I'll try to investigate more later.

For now, could we keep this issue open to gather more feedback?

Running into import error when using generated code in React app

Hello @timostamm! Thanks for putting together such a wonderful tool. I've been tinkering with the code and I'm running into a peculiar import error I've been struggling to debug. My goal is to be able to create functional client / interfaces from proto files against a server running scalapb https://scalapb.github.io/json.html.

./src/utils/proto/google/protobuf/wrappers.ts
Attempted import error: 'LongType' is not exported from '@protobuf-ts/runtime'.

I have a pretty specific setup I'm testing, so the next steps are to work on isolating the issue further, but I thought I would ask and see if this is something you have experience with.

Basically my process was to run protobuf-ts against my interfaces monorepo. Once I was happy with the configuration and generated code, I proceeded to copy the generated files into a local project in src/utils/proto.

npx protoc --ts_out protobuf-ts --ts_opt long_type_string --ts_opt disable_service_client --ts_opt generate_dependencies --proto_path proto ...

In the package json, I have the following:

    "@protobuf-ts/plugin": "^1.0.3",
    "@protobuf-ts/runtime": "^1.0.3",

I verified in node_modules that I see LongType is exported from runtime types.

$ cat node_modules/\@protobuf-ts/runtime/build/types/index.d.ts

...
export { ScalarType, LongType, RepeatType, MessageInfo, EnumInfo, FieldInfo, PartialFieldInfo, normalizeFieldInfo, readFieldOptions } from './reflection-info';
...

I notice that in src/utils/proto/google/protobuf/wrappers.ts, the LongType import is also the first import, so I doubt this has anything to do specifically with that type.

Some info on node / yarn version.

$ node -v
v12.18.3

$ npm -v
6.14.6

$ yarn -v
1.21.1

If I find the solution, I will be sure to update this. Thanks again!

"Sorry, protobuf-ts was unable to install protoc..." when using older node version

install.js uses the {recursive: true} option of fs.mkdirSync, which was only introduced in Node 10.12.0. If you're using an older version, the option is silently ignored and you get:

Sorry, protobuf-ts was unable to install protoc...
ENOENT: no such file or directory, mkdir '/path/to/node_modules/@protobuf-ts/protoc/installed/protoc-3.13.0-linux-x86_64/include/google/protobuf'

It would probably be worthwhile to emit a more informative error message when an old version of Node is detected.
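A minimal sketch of such a check (the helper name is hypothetical; the version cutoff is from the report above):

```typescript
// Hypothetical helper: detect whether a given Node version honors
// fs.mkdirSync(path, { recursive: true }) - the option was added in 10.12.0
// and is silently ignored on older versions.
function supportsRecursiveMkdir(nodeVersion: string): boolean {
  const [major, minor] = nodeVersion.split(".").map(Number);
  return major > 10 || (major === 10 && minor >= 12);
}

// install.js could check process.versions.node with this helper and print a
// clearer message, e.g.:
//   "protobuf-ts needs Node >= 10.12.0 to install protoc (found 10.11.0)"
```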

Automate dependency updates?

There are a few vulnerabilities when deps are installed:

$ make
npm i
audited 748 packages in 2.553s

24 packages are looking for funding
  run `npm fund` for details

found 25 vulnerabilities (4 low, 21 high)
  run `npm audit fix` to fix them, or `npm audit` for details

It would be nice to update dependencies automatically, e.g. using Renovate. You can set it to update once in a while (e.g. monthly) to avoid too much churn.

Runtime TypeError getBigUint64 is not a function using Safari 14

Hi @timostamm, I have recently updated to Safari 14 and noticed the following error in my application which appears to be coming from the runtime.

return VIEW64.getBigUint64(0, true);

Unhandled Promise Rejection: TypeError: O.getBigUint64 is not a function. (In 'O.getBigUint64(0,!0)', 'O.getBigUint64' is undefined)

Surprisingly, I didn't see this error in Safari 13, despite this method apparently not being supported there either. I'm guessing the error was silently ignored?

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/DataView#Browser_compatibility

I did find some discussion on this topic already, and it looks like it wouldn't be too bad to write a polyfill using JSBI.
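On engines that do have bigint but lack DataView.prototype.getBigUint64, the fallback can also be sketched without JSBI, from two 32-bit reads (the function name below is illustrative, not the runtime's actual code):

```typescript
// Hypothetical fallback: assemble a 64-bit unsigned integer from two 32-bit
// reads, for engines where DataView.prototype.getBigUint64 is unavailable.
function getBigUint64Compat(view: DataView, byteOffset: number, littleEndian = false): bigint {
  // In little-endian layout the low 32 bits come first; in big-endian, last.
  const lo = view.getUint32(byteOffset + (littleEndian ? 0 : 4), littleEndian);
  const hi = view.getUint32(byteOffset + (littleEndian ? 4 : 0), littleEndian);
  return (BigInt(hi) << 32n) | BigInt(lo);
}
```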

System information

  • macOS: 10.15.7
  • Safari: Version 14.0 (15610.1.28.1.9, 15610)

Let me know if there is any other information I can provide! Thanks again for putting together this project!

[Question] Dynamic/Runtime support

Hey Timo,

I have a question for you:

protobufjs supports dynamic code creation to create proto in the runtime from a proto file or a JSON descriptor using protobuf.load().

I wonder:

  • if it would be possible to implement this in protobuf-ts,
  • if yes, how complex would it be?

I can think of 2 ways to do this:

  1. Embed the TS compiler and generate ES6 to be eval'd - we would still need a way to parse the proto,
  2. Generate the FieldInfo array from the proto file or JSON descriptor. When generating from the proto file, there would also need to be a runtime proto parser.

Do you have any thoughts to share about this?
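Idea 2 might be sketched roughly like this (all shapes here are simplified illustrations, not the actual protobuf-ts API; only the scalar-ID subset shown is covered):

```typescript
// Hypothetical sketch: derive a FieldInfo-like array from a (simplified)
// JSON descriptor. A real implementation would also cover messages, enums,
// maps, repeated fields and the full scalar set.
interface JsonFieldDescriptor { name: string; number: number; type: string; }
interface FieldInfoLite { no: number; name: string; kind: "scalar"; T: number; }

// Subset of the descriptor.proto field type numbers (int32 = 5, string = 9, bytes = 12).
const SCALAR_IDS: Record<string, number> = { int32: 5, string: 9, bytes: 12 };

function fieldsFromDescriptor(fields: JsonFieldDescriptor[]): FieldInfoLite[] {
  return fields.map(f => ({
    no: f.number,
    name: f.name,
    kind: "scalar" as const,
    T: SCALAR_IDS[f.type],
  }));
}
```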

Generate clients for @grpc/grpc-js

As an alternative to generic service clients and the gRPC transport, it might be beneficial to generate a @grpc/grpc-js client.

This would allow users to use all features provided by @grpc/grpc-js.
It should also ease migration from protobuf-js to protobuf-ts.

TODO

  • manually set up a client to see what needs to be generated - todo-generate.ts
  • should server streams also accept a callback? -> no; the grpc-tools package doesn't offer one either
  • create code generator
    • honor reserved names
    • use plugin parameter "client_grpc1"
  • create example project
  • split GrpcOptions and manage client life cycle, see #107
  • documentation (MANUAL.md)

Questions on the generated code?

Is there anything preventing using

{ no: 2, name: 'lat', kind: 'scalar', repeat: RepeatType.PACKED, T: ScalarType.SINT32 },

instead of

{ no: 2, name: 'lat', kind: 'scalar', repeat: 1 /*RepeatType.PACKED*/, T: 17 /*ScalarType.SINT32*/ },

edit:

Oh, I guess it's because a non-const enum would not be replaced by a value and consumes more bytes.

Could we do something similar for the type field?

ie:

const typeScalar = `scalar`;
const typeMap = `Map`;
[...]

{ no: 2, name: 'lat', kind: typeScalar, repeat: 1 /*RepeatType.PACKED*/, T: 17 /*ScalarType.SINT32*/ },

The consts (typeScalar, typeMap, ...) would get minified, which would save bytes - but I'm not sure of the impact after gzip.

Maybe another idea would be to create a local (i.e. not exported) const enum in this file - I am not sure if this would work with React, though.

Yet another idea would be to make { no: 2, name: 'lat', kind: typeScalar, repeat: 1 /*RepeatType.PACKED*/, T: 17 /*ScalarType.SINT32*/ } a Tuple.


Another possible optimization could be to skip the no if it is equal to the index (+1) in the table, i.e.

    super('Track', [
      { no: 1, ...},
      { no: 2, ...},
      { no: 3, ...},
      { no: 4, ...},
      { no: 5, ...},
    ]);

// could be rewritten as

    super('Track', [
      {...},
      { ...},
      {...},
      {...},
      {...},
    ]);

the indexes could be auto-filled here - not a huge saving, though
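The auto-fill idea could look roughly like this (names hypothetical, field shape simplified to the parts that matter here):

```typescript
// Hypothetical sketch: fill omitted field numbers from the array index + 1,
// as proposed above. Explicit numbers still win, so only trailing gaps in a
// consecutive 1..n numbering can be elided.
interface FieldInfoLite { no?: number; name: string; }

function fillFieldNumbers(fields: FieldInfoLite[]): { no: number; name: string }[] {
  return fields.map((f, i) => ({ no: f.no ?? i + 1, name: f.name }));
}
```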

Support rxjs Observable and Promise as return types for RPC client methods

Right now, using RPC requires working with the types UnaryCall, ServerStreamingCall.

Unary example:

let call = client.makeHat({ inches: 23 });

let headers: RpcMetadata = await call.headers;
let response: Hat = await call.response; 
let trailers: RpcMetadata = await call.trailers;
let status: RpcStatus = await call.status;

Server-streaming example:

let call = client.serverStream(AllMethodsRequest.create());

let headers: RpcMetadata = await call.headers;
for await (let msg of call.responses) {
    // msg is an AllMethodsResponse
}
let trailers: RpcMetadata = await call.trailers;
let status: RpcStatus = await call.status;

The types are designed to allow for all gRPC features. For example, it is valid for a server to send message(s), then an error status (and the client should always check the status).

In most cases, however, this complexity is not necessary. There are some facilities to make it easier to use for standard use cases:

// waits until the entire call is finished (and status is okay)
let {response} = await client.makeHat({ inches: 23 });

But in the end, this is too complicated. Most of the time, I just want a response, and having to read up on the semantics is not a good DX.

I would like to have an option to have the methods return well-known async types Observable (rxjs) and Promise:

  • Example for unary method with Promise return type:
    makeHat(request: Size, options?: RpcOptions): Promise<Hat>
  • Unary with Observable:
    makeHat(request: Size, options?: RpcOptions): Observable<Hat>
  • Server-streaming with Promise:
    makeManyHats(request: Size, options?: RpcOptions): AsyncIterable<Hat>
  • Server-streaming with Observable:
    makeManyHats(request: Size, options?: RpcOptions): Observable<Hat>

#7 paves the way for a new service option:

service Haberdasher {
    // change the method signatures of this service client to use rxjs Observable:
    option (ts.rpc_style) = RXJS;

    rpc MakeHat (Size) returns (Hat);
}

There should probably also be a plugin parameter to change the default style.

Still pondering whether it should be possible to generate multiple styles in one go...
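Until such an option exists, the promise style can be approximated in user land. A sketch (the call shape below is simplified and hypothetical, not the actual UnaryCall type):

```typescript
// Hypothetical sketch: reduce a *Call-style unary result to a single Promise
// that resolves with the response, or rejects if the status is not OK.
interface UnaryCallLike<T> {
  response: Promise<T>;
  status: Promise<{ code: string }>;
}

async function asPromise<T>(call: UnaryCallLike<T>): Promise<T> {
  // Await both, so a bad status is never silently dropped.
  const [response, status] = await Promise.all([call.response, call.status]);
  if (status.code !== "OK") {
    throw new Error(`RPC failed with status ${status.code}`);
  }
  return response;
}
```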

Generic server support

For clients, we generate a generic interface / implementation that is agnostic to the protocol.
The actual protocol implementation is delegated to a "transport".

A similar concept can be applied to servers. We generate a generic interface for a server, and provide the protocol in "backend" packages.
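The idea, sketched with hypothetical names (this is not the generated protobuf-ts code, just an illustration of the shape):

```typescript
// Hypothetical sketch: a transport-agnostic server interface for a unary
// method. A "backend" package would adapt this interface to a concrete
// protocol such as gRPC.
interface ServerCallContextLite {
  deadline?: Date;
}

interface IHaberdasher {
  makeHat(request: { inches: number }, context: ServerCallContextLite): Promise<{ size: number }>;
}

// A protocol-agnostic implementation; the backend invokes it per request.
const haberdasher: IHaberdasher = {
  async makeHat(request) {
    return { size: request.inches };
  },
};
```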

Proof of concept:

TODO

  • come up with sensible method signatures
  • adapter for unary
  • adapter for server streaming
  • adapter for client streaming
  • adapter for bidi streaming
  • check cancellation support
  • test deadlines
    • client-side, grpc-transport #77
    • server-side, grpc-backend
  • test error handling
  • fix UnhandledPromiseRejectionWarning when cancelling client call, see client-cancel-test.ts
  • generate generic server interfaces (proto-file-name.server.ts)
  • add plugin parameter "server_generic"
  • add server style GENERIC
  • example package
  • documentation (example-chat-system, MANUAL.md)

gRPC server generation doesn't work as documented

First I tried using the instructions here: https://github.com/timostamm/protobuf-ts/blob/master/MANUAL.md#grpc-server

Got the following error:

npx protoc --ts_opt server_grpc1,client_none,generate_dependencies,optimize_code_size --proto_path proto --ts_out ./src/grpc/gen proto/*.proto
@protobuf-ts/protoc installed protoc v3.14.0.
--ts_out: ParameterError

Parameter "server_grpc1" not recognized.

Then I noticed the packages in the example are the alpha packages. I installed them then got this error:

npx protoc --ts_opt server_grpc1,client_none,generate_dependencies,optimize_code_size --proto_path proto --ts_out ./src/grpc/gen proto/*.proto
@protobuf-ts/protoc installed protoc v3.14.0.
--ts_out: ParameterError

Parameter "server_grpc1" not recognized.

Available parameters:

- "long_type_string"
  Sets jstype = JS_STRING for message fields with 64 bit integral values. 
  The default behaviour is to use native `bigint`. 
  Only applies to fields that do *not* use the option `jstype`.

- "long_type_number"
  Sets jstype = JS_NUMBER for message fields with 64 bit integral values. 
  The default behaviour is to use native `bigint`. 
  Only applies to fields that do *not* use the option `jstype`.

- "long_type_bigint"
  Sets jstype = JS_NORMAL for message fields with 64 bit integral values. 
  This is the default behavior. 
  Only applies to fields that do *not* use the option `jstype`.

- "generate_dependencies"
  By default, only the PROTO_FILES passed as input to protoc are generated, 
  not the files they import. Set this option to generate code for dependencies 
  too.

- "client_none"
  Do not generate rpc clients. 
  Only applies to services that do *not* use the option `ts.client`. 
  If you do not want rpc clients at all, use `force_client_none`.

- "client_call"
  Use *Call return types for rpc clients. 
  Only applies to services that do *not* use the option `ts.client`. 
  Since CALL is the default, this option has no effect.

- "client_promise"
  Use Promise return types for rpc clients. 
  Only applies to services that do *not* use the option `ts.client`.

- "client_rx"
  Use Observable return types from the `rxjs` package for rpc clients. 
  Only applies to services that do *not* use the option `ts.client`.

- "force_client_none"
  Do not generate rpc clients, ignore options in proto files.

- "enable_angular_annotations"
  If set, the generated rpc client will have an angular @Injectable() 
  annotation and the `RpcTransport` constructor argument is annotated with a 
  @Inject annotation. For this feature, you will need the npm package 
  '@protobuf-ts/runtime-angular'.

- "server_none"
  Do not generate rpc servers. 
  This is the default behaviour, but only applies to services that do 
  *not* use the option `ts.server`. 
  If you do not want servers at all, use `force_server_none`.

- "server_generic"
  Generate a generic server interface. Adapters can be used to serve the service, 
  for example @protobuf-ts/grpc-backend for gRPC. 
  Only applies to services that do *not* use the option `ts.server`.

- "server_grpc"
  Generate a server interface and definition for use with @grpc/grpc-js. 
  Only applies to services that do *not* use the option `ts.server`.

- "force_server_none"
  Do not generate rpc servers, ignore options in proto files.

- "optimize_speed"
  Sets optimize_for = SPEED for proto files that have no file option 
  'option optimize_for'. Since SPEED is the default, this option has no effect.

- "optimize_code_size"
  Sets optimize_for = CODE_SIZE for proto files that have no file option 
  'option optimize_for'.

- "force_optimize_code_size"
  Forces optimize_for = CODE_SIZE for all proto files, ignore file options.

- "force_optimize_speed"
  Forces optimize_for = SPEED for all proto files, ignore file options.

I changed to the server_grpc option and it worked; just leaving this note about the docs.

Generate clients into separate files

At the moment, service clients are generated into a single file.

For example, the following .proto:

syntax = "proto3";
package spec;
import "protobuf-ts.proto";
import "google/protobuf/wrappers.proto";

service AllStyleService {
    option (ts.client) = CALL;
    option (ts.client) = RX;
    option (ts.client) = PROMISE;

    rpc Unary (google.protobuf.StringValue) returns (google.protobuf.Int32Value);
    rpc ServerStream (google.protobuf.StringValue) returns (stream google.protobuf.Int32Value);
    rpc ClientStream (stream google.protobuf.StringValue) returns (google.protobuf.Int32Value);
    rpc Bidi (stream google.protobuf.StringValue) returns (stream google.protobuf.Int32Value);
}

Generates a typescript file service-style-all.ts:

export const AllStyleService = new ServiceType("spec.AllStyleService", [
    { name: "Unary", options: {}, I: StringValue, O: Int32Value },
    { name: "ServerStream", serverStreaming: true, options: {}, I: StringValue, O: Int32Value },
    { name: "ClientStream", clientStreaming: true, options: {}, I: StringValue, O: Int32Value },
    { name: "Bidi", serverStreaming: true, clientStreaming: true, options: {}, I: StringValue, O: Int32Value }
]);

export interface IAllStyleServiceCallClient { /* ... */ }
export class AllStyleServiceCallClient implements IAllStyleServiceCallClient, ServiceInfo { /* ... */ }

export interface IAllStyleServiceRxClient { /* ... */ }
export class AllStyleServiceRxClient implements IAllStyleServiceRxClient, ServiceInfo { /* ... */ }

export interface IAllStyleServicePromiseClient { /* ... */ }
export class AllStyleServicePromiseClient implements IAllStyleServicePromiseClient, ServiceInfo { /* ... */ }

This has the following disadvantages:

  • client names have to be suffixed to prevent name clashes
  • removing / adding a second client style will change client names, breaking imports in other code
  • if we start generating servers (#52), the file will have even more symbols

Possible solution:

  • generate individual files for each client style
  • CALL => service-style-all.client.ts
  • RX => service-style-all.rx-client.ts
  • PROMISE => service-style-all.promise-client.ts
  • keep same symbol name for the client: AllStyleServiceClient

TODO:

  • generate individual files for each client style
  • fix collision of ServerStyle.NONE and ClientStyle.NONE
  • prevent file name clashes
  • refactor for better readability

Running on platforms with unofficial protoc builds (e.g. Raspberry Pi)

Hi!
I wrote a small home automation tool using protobuf-ts on my Mac, where it works great, and I love the compactness and ease of use of the library. Thank you for writing it!

Now that I'm trying to deploy it to my Raspberry Pi (32-bit), I'm running into an issue: the protobuf owners don't seem to publish official binaries for that platform. Instead, I use protoc by simply installing the protobuf compiler apt package. I assume this is built by someone else, but it seems to work fine. Here is the precise version of protoc that I'm installing that way:

$ file /usr/bin/protoc
/usr/bin/protoc: ELF 32-bit LSB executable, ARM, EABI5 version 1 (SYSV), dynamically linked

Sadly, though, protobuf-ts insists on downloading protoc from the release repository, or at least I haven't found a way to circumvent this. Is there an "official" way to do this, or should I find a way to hack around it?

Thanks!
Dominik

Plugin wide option to generate int64 as number

There is currently no plugin wide option to set the generated int64 type to number.

See protobufts-plugin.ts: BIG_INT and STRING are the only possible values.

Should there be a long_type_number option (and maybe also a long_type_bigint)?

Notes:

  • it is still possible to override this at field level,
  • I can not use BigInt because AppEngine is still on Node 12.

edit: I'm happy to work on a PR once we agree on the option names.

Support message options

Custom field options are added to the generated code as JSON. Same for service options and method options.

But message options are missing.

It should be straight-forward to add this feature. MessageType should have an "options" property just like ServiceType.
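The proposal, sketched with simplified shapes (these are illustrations, not the actual protobuf-ts types):

```typescript
// Hypothetical sketch: a MessageType that carries custom message options as
// JSON, mirroring what ServiceType already does for service options.
type JsonOptionsMap = { [extensionName: string]: unknown };

class MessageTypeLite {
  constructor(
    readonly typeName: string,
    readonly options: JsonOptionsMap = {}, // custom options keyed by extension name
  ) {}
}

const t = new MessageTypeLite("spec.MessageWithOptions", { "spec.my_option": true });
```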

sourcemaps in npm release reference nonexistent files

Webpack outputs warnings like this:

WARNING in ./node_modules/@protobuf-ts/runtime/build/es2015/assert.js
Module Warning (from ./node_modules/source-map-loader/dist/cjs.js):
Failed to parse source map from '/tmp/example/node_modules/@protobuf-ts/runtime/src/assert.ts' file: Error: ENOENT: no such file or directory, open '/tmp/example/node_modules/@protobuf-ts/runtime/src/assert.ts'
 @ ./node_modules/@protobuf-ts/runtime/build/es2015/index.js 41:0-89 41:0-89 41:0-89 41:0-89 41:0-89 41:0-89

./node_modules/@protobuf-ts/runtime/build/es2015/assert.js contains "sources":["../../src/assert.ts"]. "../../src/assert.ts" does not exist in the npm release.

Similar warnings are output for the other files, too. Other sourcemaps have the same problem of referencing nonexistent files.

throw `Error: ENOENT: no such file or directory, stat '...'` when using Vite

First thanks for your great work, it is very easy to use!

My only problem: when I start a Vite server, it throws many Error: ENOENT: no such file or directory, stat '...' errors.

I found the reason: the published npm package includes the .map files, but the .ts files they map to are not included.

I see you have added .js.map to .gitignore now, but that is not enough: if "sourceMap": true is set, a line like //# sourceMappingURL=assert.js.map is added to the .js files, so the missing .map errors remain. You would need to disable the sourceMap option for publishing.

Is there a problem with the incoming type or protobuf-ts's support for bytes?

When the field type is bytes, there is a problem calculating the data length in the generated code.

my proto file:

syntax="proto3";

package com.engine.input;

option java_package = "com.engine.input";
option java_multiple_files=false;
option java_outer_classname="AcgInputAvatar";

// upstream message from client to server
message AcgAvatarInputMsg
{
  string clientTime = 1;
  bytes inputData = 2;
}

Created code:

// @generated by protobuf-ts 1.0.11 with parameters long_type_string
// @generated from protobuf file "AcgInputAvatar.proto" (package "com.engine.input", syntax proto3)
// tslint:disable
import { BinaryWriteOptions } from '@protobuf-ts/runtime';
import { IBinaryWriter } from '@protobuf-ts/runtime';
import { WireType } from '@protobuf-ts/runtime';
import { BinaryReadOptions } from '@protobuf-ts/runtime';
import { IBinaryReader } from '@protobuf-ts/runtime';
import { UnknownFieldHandler } from '@protobuf-ts/runtime';
import { PartialMessage } from '@protobuf-ts/runtime';
import { reflectionMergePartial } from '@protobuf-ts/runtime';
import { MessageType } from '@protobuf-ts/runtime';
/**
 * upstream message from client to server
 *
 * @generated from protobuf message com.engine.input.AcgAvatarInputMsg
 */
export interface AcgAvatarInputMsg {
    /**
     * @generated from protobuf field: string clientTime = 1;
     */
    clientTime: string; // timestamp
    /**
     * @generated from protobuf field: bytes inputData = 2;
     */
    inputData: Uint8Array; // operation command
}
/**
 * Type for protobuf message com.engine.input.AcgAvatarInputMsg
 */
class AcgAvatarInputMsg$Type extends MessageType<AcgAvatarInputMsg> {
  constructor() {
    super('com.engine.input.AcgAvatarInputMsg', [
      { no: 1, name: 'clientTime', kind: 'scalar', T: 9 /* ScalarType.STRING */ },
      { no: 2, name: 'inputData', kind: 'scalar', T: 12 /* ScalarType.BYTES */ },
    ]);
  }

  create(value?: PartialMessage<AcgAvatarInputMsg>): AcgAvatarInputMsg {
    const message = { clientTime: '', inputData: new Uint8Array(0) };
    if (value !== undefined) reflectionMergePartial<AcgAvatarInputMsg>(this, message, value);
    return message;
  }

  internalBinaryRead(reader: IBinaryReader, length: number, options: BinaryReadOptions, target?: AcgAvatarInputMsg): AcgAvatarInputMsg {
    const message = target ?? this.create();
    const end = reader.pos + length;
    while (reader.pos < end) {
      const [fieldNo, wireType] = reader.tag();
      switch (fieldNo) {
        case /* string clientTime */ 1:
          message.clientTime = reader.string();
          break;
        case /* bytes inputData */ 2:
          message.inputData = reader.bytes();
          break;
        default:
          const u = options.readUnknownField;
          if (u === 'throw') throw new globalThis.Error(`Unknown field ${fieldNo} (wire type ${wireType}) for ${this.typeName}`);
          const d = reader.skip(wireType);
          if (u !== false) (u === true ? UnknownFieldHandler.onRead : u)(this.typeName, message, fieldNo, wireType, d);
      }
    }
    return message;
  }

  internalBinaryWrite(message: AcgAvatarInputMsg, writer: IBinaryWriter, options: BinaryWriteOptions): IBinaryWriter {
    /* string clientTime = 1; */
    if (message.clientTime !== '') writer.tag(1, WireType.LengthDelimited).string(message.clientTime);
    /* bytes inputData = 2; */
    if (message.inputData.length) writer.tag(2, WireType.LengthDelimited).bytes(message.inputData);
    const u = options.writeUnknownFields;
    if (u !== false) (u == true ? UnknownFieldHandler.onWrite : u)(this.typeName, message, writer);
    return writer;
  }
}
export const AcgAvatarInputMsg = new AcgAvatarInputMsg$Type();

Since inputData is declared with the bytes type, I pass a Uint8Array when I execute acgAvatarInputMsg.toBinary().

But the buffer does not have a length attribute, only byteLength.
The same problem exists in @protobuf-ts/runtime: the BinaryWriter.finish() method also accesses the length property instead of byteLength.

Is there a problem with the incoming type or protobuf-ts's support for bytes?
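For what it's worth, a plain Uint8Array view does expose both length and byteLength; it is the underlying ArrayBuffer that only has byteLength. A standalone sanity check (not using protobuf-ts):

```typescript
// A Uint8Array view has both .length (element count) and .byteLength.
const inputData = new Uint8Array([0xde, 0xad, 0xbe, 0xef]);
const viewLength = inputData.length;         // 4
const viewByteLength = inputData.byteLength; // 4

// The raw ArrayBuffer behind it only has .byteLength, no .length.
const buffer = inputData.buffer;
const bufferHasLength = "length" in buffer;  // false
```

So if the value reaching the writer lacks length, it may be a raw ArrayBuffer (or a Node Buffer obtained some other way) rather than the Uint8Array the generated code expects.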

Can't use generated models as type option in Vue props

I feel like this is probably due to my lack of knowledge of TypeScript (and/or this package), but I can't figure out how to satisfy the following with the generated types from this package:

declare type PropConstructor<T> = {
    new (...args: any[]): T & object;
} | {
    (): T;
} | {
    new (...args: string[]): Function;
};

The closest I got to satisfying the compiler is:

import { ReportOption } from '@company/api-client';
//  ...
 props: {
    value: {
      required: true,
      type: () => ReportOption.create(),
    },
  },

But I'm greeted with the following error at runtime:

Invalid prop: type check failed for prop "value". Expected Type, got Object

I understand from the docs that the exported "model" is actually an instance implementing the IMessageType interface, but I'm lost on how to get this working.
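One common workaround (a sketch, assuming Vue's usual PropType-style cast; ReportOption here is a stand-in for the generated interface) is to declare the runtime type as Object, which passes Vue's runtime check for plain objects, and narrow the static type with a cast:

```typescript
// Stand-in for the generated protobuf-ts interface (hypothetical shape).
interface ReportOption {
  name: string;
}

// Mirrors the PropConstructor branch Vue checks: a constructor producing T.
type PropType<T> = { new (...args: any[]): T & object };

// Runtime check passes (the value is a plain object, so `Object` matches),
// while the compiler still sees the prop value as ReportOption.
const valueProp = {
  required: true,
  type: Object as unknown as PropType<ReportOption>,
};
```

The key point is that protobuf-ts messages are plain objects, not class instances, so a constructor-based runtime check can only ever match Object.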

RangeError in google.protobuf.Timestamp.fromDate()

The custom fromDate method of the well-known type Timestamp throws a RangeError when the Date has non-zero milliseconds.

Occurs on Google Chrome, Version 68.0.4240.80 (x86_64), mac os.

Reproduction:

Timestamp.fromDate(new Date(2020, 0, 1, 12, 30, 59, 600))
          ^^^^^^^^
core.js:6241 ERROR RangeError: The number 1577878259.6 cannot be converted to a BigInt because it is not an integer
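The fractional seconds are the culprit: BigInt() rejects non-integers. A minimal sketch of the likely fix (truncating to whole seconds and carrying the remainder as nanos; this is an illustration, not the actual protobuf-ts implementation):

```typescript
// Hypothetical fromDate sketch: split milliseconds into whole seconds + nanos.
function timestampFromDate(date: Date): { seconds: bigint; nanos: number } {
  const ms = date.getTime();
  const seconds = BigInt(Math.floor(ms / 1000)); // integer, safe for BigInt()
  const nanos = (ms % 1000) * 1_000_000;
  return { seconds, nanos };
}

// BigInt(1577878259.6) would throw a RangeError; flooring first avoids it.
const ts = timestampFromDate(new Date(2020, 0, 1, 12, 30, 59, 600));
```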

Support Enum / Enum value options

Protobuf has enum options. For example:

extend google.protobuf.EnumOptions {
    bool enum_opt_bool = 2001;
}

extend google.protobuf.EnumValueOptions {
    bool enum_value_opt_bool = 3001;
}


enum AnnotatedEnum {

    option (spec.enum_opt_bool) = true;

    UNSPECIFIED = 0;
    FOO = 1 [(spec.enum_value_opt_bool) = true];
}

These options should be made available in the generated code as JSON, just like field and service / method options already are.

Implementation is probably not straightforward because there is no good place to put the data. One approach could be to use symbols to add the options to a hidden property on the TypeScript enum object.
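A sketch of that symbol approach (the symbol name and the options shape are assumptions, not an existing API):

```typescript
enum AnnotatedEnum {
  UNSPECIFIED = 0,
  FOO = 1,
}

// Hypothetical hidden key for the reflection data.
const ENUM_OPTIONS = Symbol("enum-options");

(AnnotatedEnum as any)[ENUM_OPTIONS] = {
  options: { "spec.enum_opt_bool": true },
  valueOptions: { FOO: { "spec.enum_value_opt_bool": true } },
};

// Symbol keys do not show up in Object.keys(), so the enum's normal
// forward and reverse mappings stay untouched.
const keys = Object.keys(AnnotatedEnum);
const opts = (AnnotatedEnum as any)[ENUM_OPTIONS];
```

Reflection code could read the options back via the shared symbol without polluting the enum's enumerable members.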

gRPC server support for @grpc/grpc-js

With @grpc/grpc-js, it is pretty easy to implement a server.

We should generate a compatible service interface and service definition.

It might be worth it to generate generic service interfaces and use an adapter for grpc, twirp, etc. At least keep the door open for this.

TODO

  • generate a server interface for the service that is compatible with @grpc/grpc-js / UntypedServiceImplementation
  • generate a service definition that is compatible with @grpc/grpc-js / ServiceDefinition
  • add a plugin parameter / service option that enables server code generation
  • example package
  • README and MANUAL

Any way to use the official grpc package or @grpc/grpc-js as the Transport?

Hi, I'm currently using the protoc-gen-grpc CLI to generate TypeScript type definitions from .proto files.

I just checked the output generated by protobuf-ts and it's much better, it even keeps the comments, which is awesome!

However, I could not find a way to use it with either grpc or @grpc/grpc-js as the transport layer. Is this possible?

With protoc-gen-grpc I can simply call any client like the following:

const client = new SomethingCustomService(grpcEndpoint, credentials.createInsecure());
const apiReq = new SomethingCustomTriggerCallRequest();
apiReq.setField('abc')
client.somethingCustomTriggerCall(apiReq, (err, resp) => {})

And I wanted to do something similar with this library.

RpcError: invalid base64 string.

It communicates with the server and works, but I get this error.
How can I solve it? I'm using React with TypeScript.

invalid base64 string.
RpcError rpc-error.js:6
unary grpc-web-transport.js:163
promise callback*unary grpc-web-transport.js:160
tail rpc-interceptor.js:10
stackIntercept rpc-interceptor.js:15
welcome contact.ts:516
callGrpcWelcomeService index.tsx:31
onClick index.tsx:72
React 14
unstable_runWithPriority scheduler.development.js:646
React 15
tsx index.tsx:7
tsx main.chunk.js:8393
Webpack 7

Generated code for "google/protobuf/any.proto"

The generated client code for a proto file with import "google/protobuf/any.proto"; imports a file that does not exist. How do I make the any file generate as part of the generated code?

Support promises in interceptors?

Is there a way to accept promises in interceptors? The use case I have is I plan on using auth0-spa-js which will handle retrieving/refreshing Access Tokens as necessary, but does so with a Promise callback. I'd like my interceptor to use this directly, but for now I've gotten around it by "caching" the token myself and setting a timer to run every minute to update the cached value directly from the auth0 client.

[BUG] grpc-transport does not pass along provided deadline option

Considering this callOpts part is commented out in each of the call types, I assume this was thought about at one point. From trying to debug this by passing along the full input options parameter, it is clear that that approach won't work. It seems like picking individual keys out of the passed options is the only way this will work.

const gCall = client.makeUnaryRequest<I, O>(
    `/${method.service.typeName}/${method.name}`,
    (value: I): Buffer => Buffer.from(method.I.toBinary(value, opt.binaryOptions)),
    (value: Buffer): O => method.O.fromBinary(value, opt.binaryOptions),
    input,
    gMeta,
    // callOpts,
    (err, value) => {

const gCall = client.makeServerStreamRequest<I, O>(
    `/${method.service.typeName}/${method.name}`,
    (value: I): Buffer => Buffer.from(method.I.toBinary(value, opt.binaryOptions)),
    (value: Buffer): O => method.O.fromBinary(value, opt.binaryOptions),
    input,
    gMeta,
    // callOpts
);

gCall = client.makeClientStreamRequest<I, O>(
    `/${method.service.typeName}/${method.name}`,
    (value: I): Buffer => Buffer.from(method.I.toBinary(value, opt.binaryOptions)),
    (value: Buffer): O => method.O.fromBinary(value, opt.binaryOptions),
    gMeta,
    // callOpts,
    (err, value) => {
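As the report suggests, a possible fix is to copy individual keys into a fresh CallOptions object rather than forwarding the whole options parameter. A sketch with simplified stand-in types (not the actual grpc-transport code):

```typescript
// Simplified stand-ins for the real option types (assumptions for the sketch).
interface RpcOptions {
  deadline?: Date | number;
  timeout?: number;
  // ...plus protobuf-ts-specific keys that grpc-js would not understand
}
interface CallOptions {
  deadline?: Date | number;
}

// Pick only the keys grpc-js understands out of the incoming options.
function pickCallOptions(opt: RpcOptions): CallOptions {
  const callOpts: CallOptions = {};
  if (opt.deadline !== undefined) {
    callOpts.deadline = opt.deadline;
  }
  return callOpts;
}
```

The resulting object could then be passed where `// callOpts` is currently commented out, without leaking unrelated keys into grpc-js.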

Server streaming not implemented

Error: NOT IMPLEMENTED
      at GrpcTransport.serverStreaming (/root/node_modules/@protobuf-ts/grpc-transport/build/commonjs/grpc-transport.js:86:15)
      at tail (/root/node_modules/@protobuf-ts/runtime-rpc/build/commonjs/rpc-interceptor.js:21:49)
      at Object.stackIntercept (/root/node_modules/@protobuf-ts/runtime-rpc/build/commonjs/rpc-interceptor.js:26:16)

Is there a way to make a server streaming client that works in NodeJS? I was trying to use the promise-client.

Provide versions of Twirp and GrpcWeb transport that don't require fetch API polyfills

The Twirp and GrpcWeb transports require polyfills for the fetch API on node.

React Native includes a polyfill for the fetch API, but it is incomplete, see #67.

Instead of relying on fetch API polyfills, it might be a better alternative to provide two new packages, grpcweb-node-transport and twirp-node-transport, which use only native Node APIs for HTTP.

The downsides are added maintenance and more choices for packages to understand.

Regression: Custom field options are missing from reflection data

Custom field options no longer show up. AnnotatedMessage of the data fixture msg-annotated.proto should have this field info:

            { no: 5, name: "ann_enum", kind: "scalar", T: ScalarType.INT32, options: { "spec.opt_enum": "OPTION_ENUM_YES" } },

But the options key is missing completely.

Method options are still present.

[REQUEST] Remove extra layer of nesting for oneof fields

As far as I can tell, in all other reference implementations the identifier that comes after oneof in the proto file is never referenced. For example:

message SampleMessage {
  oneof test_oneof {
    string name = 1;
  }
}

message SampleRequest {
  SampleMessage message = 1;
}

message SampleResponse {}

service SampleService {
  rpc SampleMethod(SampleRequest) returns (SampleResponse);
}

The reference C++ code:

SampleMessage message;
message.set_name("name");

But the protobuf-ts generated code requires you to reference test_oneof.

const client = new SampleService();
const {response, status} = await client.sampleMethod({
  message: {
    testOneof: { // unnecessary layer of nesting 
      oneofKind: 'name',
      name: 'test'
    }
  }
});

It would be great to remove the requirement to reference the oneof identifier like the other reference implementations.

How to use google/protobuf/any.proto

I am using the Any type in one of my proto files:

import "google/protobuf/any.proto";

But when I run the generator, it generates the import like:

import { Any } from "./google\\protobuf\\any";

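The backslashes are Windows path separators leaking into the import specifier; module specifiers should always use forward slashes. A normalization sketch (not the plugin's actual code):

```typescript
// Normalize a Windows-style relative path into a valid module specifier.
function toImportPath(p: string): string {
  return p.replace(/\\/g, "/");
}

// "./google\protobuf\any" becomes "./google/protobuf/any".
const fixed = toImportPath("./google\\protobuf\\any");
```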

lowerCaseServiceMethods & method name errors for twirp ClientImpl

Using --ts_proto_opt=lowerCaseServiceMethods=true on the Haberdasher example, the generated code still has Pascal-case MakeHat instead of makeHat. Also, I believe the literal "methodDesc.name" is incorrect; changing it to "MakeHat" (or "makeHat" if the above is fixed) lets the client connect to a Twirp server and receive results correctly.

The generated Impl:

export class HaberdasherClientImpl implements Haberdasher {
  private readonly rpc: Rpc;
  constructor(rpc: Rpc) {
    this.rpc = rpc;
  }
  MakeHat(request: Size): Promise<Hat> {
    const data = Size.encode(request).finish();
    const promise = this.rpc.request(
      "users.Haberdasher",
      "methodDesc.name",
      data
    );
    return promise.then((data) => Hat.decode(new _m0.Reader(data)));
  }
}

Here is the full test.

import { HaberdasherClientImpl } from '@/generated/protos/haberdasher'
import axios from 'axios'

class TwirpRPC {
  constructor(
    public server = 'http://localhost:8080',
    public prefix = '/twirp',
  ) {}

  async request(
    service: string,
    method: string,
    data: Uint8Array,
  ): Promise<Uint8Array> {
    const { server, prefix } = this
    const url = `${server}${prefix}/${service}/${method}`
    const headers = {
      'Content-Type': 'application/protobuf',
    }

    const res = await axios({
      method: 'POST',
      url,
      headers,
      data: data.slice(),
      responseType: 'arraybuffer',
    })
    return new Uint8Array(res.data)
  }
}
export async function test() {
  const twirp = new TwirpRPC()
  const client = new HaberdasherClientImpl(twirp)
  const h = await client.MakeHat({ inches: 12345 })
  console.log(JSON.stringify(h, null, 2))
}

Or am I using it incorrectly?

ReferenceError: TextEncoder is not defined

Not sure if this is specific to my version of Node (v12) or to my environment (typescript: 6.13.4, etc.).

When I try to call, MyMessage.toBinary(myMsg);
../../node_modules/@protobuf-ts/runtime/src/binary-writer.ts
throws ReferenceError: TextEncoder is not defined

Changing it to an explicit import in the compiled JS fixed it for me:
inserting const util_1 = require("util"); after the imports on line 6, and changing line 13 from
this.textEncoder = textEncoder !== null && textEncoder !== void 0 ? textEncoder : new TextEncoder();
to
this.textEncoder = textEncoder !== null && textEncoder !== void 0 ? textEncoder : new util_1.TextEncoder();

I haven't taken the time to check out the project, but I'd be happy to make the change when I have time. I'd add import { TextEncoder } from 'util'; in binary-writer.ts. It should be pretty safe, but I don't know if TextEncoder was meant to be taken from util or if a different package was intended. ¯_(ツ)_/¯

I haven't gotten as far as decoding any messages, so new TextDecoder() may be an issue too if we initialize it like that somewhere.

Lmk if I'm way off here.
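For reference, Node does export both classes from the built-in util module (TextDecoder included), so the proposed import covers the decoding side as well. A standalone check:

```typescript
// On Node, TextEncoder/TextDecoder are exported by the built-in util module
// (newer Node versions also expose them as globals).
import { TextEncoder, TextDecoder } from "util";

const bytes = new TextEncoder().encode("héllo"); // UTF-8: é takes 2 bytes
const roundTrip = new TextDecoder().decode(bytes);
```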

Provide and detect well known types

I have a SPA that will utilize multiple gRPC clients for various APIs. My APIs use many well-known types and imports, which end up generating the same dependencies, but they aren't deduped, so 100K+ of duplicate code gets bundled into my app for each API I import and use.

Would it be possible to provide a @protobuf-ts/google package that contains some well known protobuf imports and the generator can detect and use this package anytime any of them are utilized?

Imports that I find myself using across APIs:

import "google/protobuf/any.proto";
import "google/protobuf/field_mask.proto";
import "google/protobuf/timestamp.proto";
import "google/protobuf/wrappers.proto";
import "google/api/annotations.proto";
import "google/api/client.proto";
import "google/api/resource.proto";
import "google/api/field_behavior.proto";
import "google/rpc/status.proto";
import "google/type/date.proto";

This may also be due to my mono-repo design: instead of generating clients directly into my SPA app folder, I do so in a general genclients folder meant to provide each client as a separate package.

protos/
     api1.proto
     api2.proto
genclients/
      api1/
            /src
            /dist
            package.json
      api2/
            /src
            /dist
            package.json
apps/
      spa1/ (imports api1 only)
            /src
            package.json
      spa2/ (imports api1 & api2)
            /src
            package.json

I'll be moving my generator scripts to be application-specific and generate the clients into the app folders directly to get around this for now - so instead of my apps not being able to detect/dedup common dependencies, I'll just be duplicating the generated clients across apps in my repository.

Misc questions

First of all thanks for this great lib !

I have a few questions about the runtime:

The lib uses custom base64 functions. Have you considered using the platform instead? (Buffer on Node, atob/btoa in the browser.) I understand that the current decode might be more lenient, but I'm not sure that's really needed, and if it is, it should not be too hard to implement.

Have you considered using const enums in the runtime, e.g. for ScalarType? This could also save some bytes. It would technically be a breaking change if someone uses the enum in their code.

I am happy to submit PRs for those if it can help.

Thanks
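For reference, the platform primitives mentioned above look like this on Node (in the browser the equivalents are btoa/atob, which operate on binary strings rather than Uint8Array, so a small conversion shim would be needed there):

```typescript
// Node: base64 round-trip via Buffer (zero custom code).
const data = new Uint8Array([104, 101, 108, 108, 111]); // "hello"
const b64 = Buffer.from(data).toString("base64");        // "aGVsbG8="
const back = new Uint8Array(Buffer.from(b64, "base64"));
```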

Provide a gRPC transport

We have transports for Twirp and gRPC-web.

But there is no transport for gRPC yet. Having at least client-support for gRPC would be nice.

@grpc/grpc-js mentioned in #44 looks like a good base for an implementation.

TODO

  • README and MANUAL
  • unary methods
  • server streaming methods
  • client streaming methods
  • bidi streaming methods -> need a better server to test against
  • example package

ReactNative GrpcWebFetchTransport produces RpcError: premature end of response

Hi, I am building an Android app using React Native and wish to use your library for making gRPC requests to my gRPC server. I have managed to get it working without problems using GrpcWebFetchTransport in a plain React application.

But the same code appears to have issues when running on Android using React Native.

I imagine it has something to do with the transport I'm using on Android, because I can clearly see that my request is sent to the server and processed correctly; the issue lies on the React Native side, specifically in reading the response.

For reference, I have tried the other transports listed in the manual, but GrpcWebFetchTransport was the only one that managed to make a request; it just failed when reading the response.

RpcError: premature end of response
    at Object.readGrpcWebResponseHeader (http://localhost:8081/index.bundle?platform=android&dev=true&minify=false:152282:38)
    at runtime_rpc_1.UnaryCall._b (http://localhost:8081/index.bundle?platform=android&dev=true&minify=false:152106:58)
    at tryCallOne (http://localhost:8081/index.bundle?platform=android&dev=true&minify=false:27058:14)
    at http://localhost:8081/index.bundle?platform=android&dev=true&minify=false:27159:17
    at http://localhost:8081/index.bundle?platform=android&dev=true&minify=false:30710:21
    at _callTimer (http://localhost:8081/index.bundle?platform=android&dev=true&minify=false:30598:9)
    at _callImmediatesPass (http://localhost:8081/index.bundle?platform=android&dev=true&minify=false:30637:7)
    at MessageQueue.callImmediates [as _immediatesCallback] (http://localhost:8081/index.bundle?platform=android&dev=true&minify=false:30854:14)
    at MessageQueue.__callImmediates (http://localhost:8081/index.bundle?platform=android&dev=true&minify=false:2738:16)
    at http://localhost:8081/index.bundle?platform=android&dev=true&minify=false:2524:18

Note: that my gRPC server only accepts binary requests and only unary calls.

example proto

syntax = "proto3";

// The greeting service definition.
service Greeter {
  // Sends a greeting
  rpc SayHello (HelloRequest) returns (HelloReply) {}
}

// The request message containing the user's name.
message HelloRequest {
  string name = 1;
}

// The response message containing the greetings
message HelloReply {
  string message = 1;
}

Client code

import { GrpcWebFetchTransport } from "@protobuf-ts/grpcweb-transport";

let transport = new GrpcWebFetchTransport({
    format: "binary",
    baseUrl: "http://localhost:50051",
});
let client = new GreeterClient(transport);
let { response } = await client.sayHello({ name: "test" });
console.log("grpcResponse: " + response);

I have been able to successfully make a gRPC request using https://github.com/improbable-eng/grpc-web/tree/master/client/grpc-web-react-native-transport, but this requires generating the proto files with a completely different method and moving away from your library, which I don't want to do.

The improbable-eng grpc-web-react-native transport seems to use XHR, but I'm not sure if that's the reason it works while your GrpcWebFetchTransport doesn't.

Any ideas?

[Question] Can't extend generated classes

Can protobuf-ts generate files that export the classes instead of instances? I am asking because I would like to extend the classes to write custom functions.

Maybe you recommend something?
