danielgtaylor / python-betterproto

Clean, modern, Python 3.6+ code generator & library for Protobuf 3 and async gRPC

License: MIT License

python-3 protocol-buffer-compiler asyncio grpc code-generator plugin

python-betterproto's Introduction

Better Protobuf / gRPC Support for Python

:octocat: If you're reading this on GitHub, please be aware that it might mention unreleased features! See the latest released README on PyPI.

This project aims to provide an improved experience when using Protobuf / gRPC in a modern Python environment by making use of modern language features and generating readable, understandable, idiomatic Python code. It will not support legacy features or environments (e.g. Protobuf 2). The following are supported:

  • Protobuf 3 & gRPC code generation
    • Both binary & JSON serialization are built-in
  • Python 3.7+ making use of:
    • Enums
    • Dataclasses
    • async/await
    • Timezone-aware datetime and timedelta objects
    • Relative imports
    • Mypy type checking
  • Pydantic Models generation (see "Generating Pydantic Models" below)

This project is heavily inspired by, and borrows functionality from, several other open-source projects.

Motivation

This project exists because I am unhappy with the state of the official Google protoc plugin for Python.

  • No async support (requires additional grpclib plugin)
  • No typing support or code completion/intelligence (requires additional mypy plugin)
  • No __init__.py module files get generated
  • Output is not importable
    • Import paths break in Python 3 unless you mess with sys.path
  • Bugs when names clash (e.g. codecs package)
  • Generated code is not idiomatic
    • Completely unreadable runtime code-generation
    • Much code looks like C++ or Java ported 1:1 to Python
    • Capitalized function names like HasField() and SerializeToString()
    • Uses SerializeToString() rather than the built-in __bytes__()
    • Special wrapped types don't use Python's None
    • Timestamp/duration types don't use Python's built-in datetime module

This project is a reimplementation from the ground up focused on idiomatic modern Python to help fix some of the above. While it may not be a 1:1 drop-in replacement due to changed method names and call patterns, the wire format is identical.

Installation

First, install the package. Note that the [compiler] feature flag tells it to install extra dependencies only needed by the protoc plugin:

# Install both the library and compiler
pip install "betterproto[compiler]"

# Install just the library (to use the generated code output)
pip install betterproto

Betterproto is under active development. To install the latest beta version, use pip install --pre betterproto.

Getting Started

Compiling proto files

Given that you have installed the compiler and have a proto file, e.g. example.proto:

syntax = "proto3";

package hello;

// Greeting represents a message you can tell a user.
message Greeting {
  string message = 1;
}

You can run the following to invoke protoc directly:

mkdir lib
protoc -I . --python_betterproto_out=lib example.proto

or run the following to invoke protoc via grpcio-tools:

pip install grpcio-tools
python -m grpc_tools.protoc -I . --python_betterproto_out=lib example.proto

This will generate lib/hello/__init__.py which looks like:

# Generated by the protocol buffer compiler.  DO NOT EDIT!
# sources: example.proto
# plugin: python-betterproto
from dataclasses import dataclass

import betterproto


@dataclass
class Greeting(betterproto.Message):
    """Greeting represents a message you can tell a user."""

    message: str = betterproto.string_field(1)

Now you can use it!

>>> from lib.hello import Greeting
>>> test = Greeting()
>>> test
Greeting(message='')

>>> test.message = "Hey!"
>>> test
Greeting(message='Hey!')

>>> serialized = bytes(test)
>>> serialized
b'\n\x04Hey!'

>>> another = Greeting().parse(serialized)
>>> another
Greeting(message='Hey!')

>>> another.to_dict()
{'message': 'Hey!'}
>>> another.to_json(indent=2)
'{\n  "message": "Hey!"\n}'
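The serialized bytes above follow the standard protobuf wire format: a tag byte encoding the field number and wire type, a length byte, then the UTF-8 payload. A minimal stdlib-only sketch of that framing (illustrative only; betterproto handles varints and all field types internally):

```python
# Sketch of the protobuf wire framing behind b'\n\x04Hey!'.
# For field 1 with wire type 2 (length-delimited):
#   tag = (field_number << 3) | wire_type
def encode_string_field(field_number: int, value: str) -> bytes:
    payload = value.encode("utf-8")
    tag = (field_number << 3) | 2  # assumes field number and length each fit in one byte
    return bytes([tag, len(payload)]) + payload

assert encode_string_field(1, "Hey!") == b"\n\x04Hey!"
```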

Async gRPC Support

The generated Protobuf Message classes are compatible with grpclib so you are free to use it if you like. That said, this project also includes support for async gRPC stub generation with better static type checking and code completion support. It is enabled by default.

Given an example service definition:

syntax = "proto3";

package echo;

message EchoRequest {
  string value = 1;
  // Number of extra times to echo
  uint32 extra_times = 2;
}

message EchoResponse {
  repeated string values = 1;
}

message EchoStreamResponse {
  string value = 1;
}

service Echo {
  rpc Echo(EchoRequest) returns (EchoResponse);
  rpc EchoStream(EchoRequest) returns (stream EchoStreamResponse);
}

Generate echo proto file:

python -m grpc_tools.protoc -I . --python_betterproto_out=. echo.proto

A client can be implemented as follows:

import asyncio
import echo

from grpclib.client import Channel


async def main():
    channel = Channel(host="127.0.0.1", port=50051)
    service = echo.EchoStub(channel)
    response = await service.echo(echo.EchoRequest(value="hello", extra_times=1))
    print(response)

    async for response in service.echo_stream(echo.EchoRequest(value="hello", extra_times=1)):
        print(response)

    # don't forget to close the channel when done!
    channel.close()


if __name__ == "__main__":
    asyncio.run(main())

which would output

EchoResponse(values=['hello', 'hello'])
EchoStreamResponse(value='hello')
EchoStreamResponse(value='hello')

This project also produces server-facing stubs that can be used to implement a Python gRPC server. To use them, simply subclass the base class in the generated files and override the service methods:

import asyncio
from echo import EchoBase, EchoRequest, EchoResponse, EchoStreamResponse
from grpclib.server import Server
from typing import AsyncIterator


class EchoService(EchoBase):
    async def echo(self, echo_request: "EchoRequest") -> "EchoResponse":
        return EchoResponse([echo_request.value for _ in range(echo_request.extra_times + 1)])

    async def echo_stream(self, echo_request: "EchoRequest") -> AsyncIterator["EchoStreamResponse"]:
        for _ in range(echo_request.extra_times + 1):
            yield EchoStreamResponse(echo_request.value)


async def main():
    server = Server([EchoService()])
    await server.start("127.0.0.1", 50051)
    await server.wait_closed()

if __name__ == "__main__":
    asyncio.run(main())

JSON

Both serializing and parsing are supported to/from JSON and Python dictionaries using the following methods:

  • Dicts: Message().to_dict(), Message().from_dict(...)
  • JSON: Message().to_json(), Message().from_json(...)

For compatibility, the default is to convert field names to camelCase. You can control this behavior by passing a casing value, e.g.:

MyMessage().to_dict(casing=betterproto.Casing.SNAKE)
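Under the hood, the casing option is just a rename of the field names between snake_case and camelCase. A stdlib-only sketch of the conversion (illustrative, not betterproto's actual implementation):

```python
import re

def snake_to_camel(name: str) -> str:
    # "extra_times" -> "extraTimes"
    head, *rest = name.split("_")
    return head + "".join(part.capitalize() for part in rest)

def camel_to_snake(name: str) -> str:
    # "extraTimes" -> "extra_times"
    return re.sub(r"([A-Z])", lambda m: "_" + m.group(1).lower(), name)

assert snake_to_camel("extra_times") == "extraTimes"
assert camel_to_snake("extraTimes") == "extra_times"
```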

Determining if a message was sent

Sometimes it is useful to be able to determine whether a message has been sent on the wire. This is how the Google wrapper types work to let you know whether a value is unset, set as the default (zero value), or set as something else, for example.

Use betterproto.serialized_on_wire(message) to determine if it was sent. This is a little bit different from the official Google generated Python code, and it lives outside the generated Message class to prevent name clashes. Note that it only supports Proto 3 and thus can only be used to check if Message fields are set. You cannot check if a scalar was sent on the wire.

# Old way (official Google Protobuf package)
>>> mymessage.HasField('myfield')

# New way (this project)
>>> betterproto.serialized_on_wire(mymessage.myfield)

One-of Support

Protobuf supports grouping fields in a oneof clause. Only one of the fields in the group may be set at a given time. For example, given the proto:

syntax = "proto3";

message Test {
  oneof foo {
    bool on = 1;
    int32 count = 2;
    string name = 3;
  }
}

On Python 3.10 and later, you can use a match statement to access the provided one-of field, which supports type-checking:

test = Test()
match test:
    case Test(on=value):
        print(value)  # value: bool
    case Test(count=value):
        print(value)  # value: int
    case Test(name=value):
        print(value)  # value: str
    case _:
        print("No value provided")

You can also use betterproto.which_one_of(message, group_name) to determine which of the fields was set. It returns a tuple of the field name and value, or a blank string and None if unset.

>>> test = Test()
>>> betterproto.which_one_of(test, "foo")
("", None)

>>> test.on = True
>>> betterproto.which_one_of(test, "foo")
("on", True)

# Setting one member of the group resets the others.
>>> test.count = 57
>>> betterproto.which_one_of(test, "foo")
("count", 57)

# Default (zero) values also work.
>>> test.name = ""
>>> betterproto.which_one_of(test, "foo")
("name", "")

Again, this is a little different from the official Google code generator:

# Old way (official Google protobuf package)
>>> message.WhichOneof("group")
"foo"

# New way (this project)
>>> betterproto.which_one_of(message, "group")
("foo", "foo's value")

Well-Known Google Types

Google provides several well-known message types like a timestamp, duration, and several wrappers used to provide optional zero value support. Each of these has a special JSON representation and is handled a little differently from normal messages. The Python mapping for these is as follows:

Google Message             Python Type                          Default
google.protobuf.duration   datetime.timedelta                   0
google.protobuf.timestamp  Timezone-aware datetime.datetime     1970-01-01T00:00:00Z
google.protobuf.*Value     Optional[...]                        None
google.protobuf.*          betterproto.lib.google.protobuf.*    None

For the wrapper types, the Python type corresponds to the wrapped type, e.g. google.protobuf.BoolValue becomes Optional[bool] while google.protobuf.Int32Value becomes Optional[int]. All of the optional values default to None, so don't forget to check for that possible state. Given:

syntax = "proto3";

import "google/protobuf/duration.proto";
import "google/protobuf/timestamp.proto";
import "google/protobuf/wrappers.proto";

message Test {
  google.protobuf.BoolValue maybe = 1;
  google.protobuf.Timestamp ts = 2;
  google.protobuf.Duration duration = 3;
}

You can do stuff like:

>>> t = Test().from_dict({"maybe": True, "ts": "2019-01-01T12:00:00Z", "duration": "1.200s"})
>>> t
Test(maybe=True, ts=datetime.datetime(2019, 1, 1, 12, 0, tzinfo=datetime.timezone.utc), duration=datetime.timedelta(seconds=1, microseconds=200000))

>>> t.ts - t.duration
datetime.datetime(2019, 1, 1, 11, 59, 58, 800000, tzinfo=datetime.timezone.utc)

>>> t.ts.isoformat()
'2019-01-01T12:00:00+00:00'

>>> t.maybe = None
>>> t.to_dict()
{'ts': '2019-01-01T12:00:00Z', 'duration': '1.200s'}
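The mappings above rely only on the standard library. A sketch of how a JSON Duration string such as "1.200s" corresponds to a timedelta and interacts with a timezone-aware datetime (illustrative helper, not betterproto API):

```python
from datetime import datetime, timedelta, timezone

def parse_duration(s: str) -> timedelta:
    # A JSON Duration is a decimal number of seconds with an "s" suffix, e.g. "1.200s"
    return timedelta(seconds=float(s.rstrip("s")))

ts = datetime(2019, 1, 1, 12, 0, tzinfo=timezone.utc)
duration = parse_duration("1.200s")
assert duration == timedelta(seconds=1, microseconds=200000)
assert ts - duration == datetime(2019, 1, 1, 11, 59, 58, 800000, tzinfo=timezone.utc)
```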

Generating Pydantic Models

You can use python-betterproto to generate pydantic-based models using pydantic dataclasses, which means the results of protobuf unmarshalling will be type checked. The usage is the same, but you need to add a custom option when calling the protobuf compiler:

protoc -I . --python_betterproto_opt=pydantic_dataclasses --python_betterproto_out=lib example.proto

The important change is --python_betterproto_opt=pydantic_dataclasses. This swaps the dataclass implementation from the built-in Python dataclass to the pydantic dataclass. You must have pydantic as a dependency in your project for this to work.

Development

Requirements

  • Python (3.7 or higher)

  • poetry, needed to install dependencies in a virtual environment

  • poethepoet for running development tasks as defined in pyproject.toml

    • Can be installed to your host environment via pip install poethepoet and then executed simply as poe
    • or run from the poetry venv as poetry run poe

Setup

# Get set up with the virtual env & dependencies
poetry install -E compiler

# Activate the poetry environment
poetry shell

Code style

This project enforces black python code formatting.

Before committing changes run:

poe format

Non-black-formatted Python code will fail in CI, so run the formatter before pushing to avoid merge conflicts later.

Tests

There are two types of tests:

  1. Standard tests
  2. Custom tests

Standard tests

Adding a standard test case is easy.

  • Create a new directory betterproto/tests/inputs/<name>
    • add <name>.proto with a message called Test
    • add <name>.json with some test data (optional)

It will be picked up automatically when you run the tests.

Custom tests

Custom tests are found in tests/test_*.py and are run with pytest.

Running

Here's how to run the tests.

# Generate assets from sample .proto files required by the tests
poe generate
# Run the tests
poe test

To run tests as they are run in CI (with tox) run:

poe full-test

(Re)compiling Google Well-known Types

Betterproto includes compiled versions of Google's well-known types at src/betterproto/lib/google. Be sure to regenerate these files when modifying the plugin output format, and validate them by running the tests.

Normally, the plugin does not compile any references to google.protobuf, since they are pre-compiled. To force compilation of google.protobuf, use the option --custom_opt=INCLUDE_GOOGLE.

Assuming your google.protobuf source files (included with all releases of protoc) are located in /usr/local/include, you can regenerate them as follows:

protoc \
    --plugin=protoc-gen-custom=src/betterproto/plugin/main.py \
    --custom_opt=INCLUDE_GOOGLE \
    --custom_out=src/betterproto/lib \
    -I /usr/local/include/ \
    /usr/local/include/google/protobuf/*.proto

TODO

  • Fixed length fields
    • Packed fixed-length
  • Zig-zag signed fields (sint32, sint64)
  • Don't encode zero values for nested types
  • Enums
  • Repeated message fields
  • Maps
    • Maps of message fields
  • Support passthrough of unknown fields
  • Refs to nested types
  • Imports in proto files
  • Well-known Google types
    • Support as request input
    • Support as response output
      • Automatically wrap/unwrap responses
  • OneOf support
    • Basic support on the wire
    • Check which was set from the group
    • Setting one unsets the others
  • JSON that isn't completely naive.
    • 64-bit ints as strings
    • Maps
    • Lists
    • Bytes as base64
    • Any support
    • Enum strings
    • Well known types support (timestamp, duration, wrappers)
    • Support different casing (orig vs. camel vs. others?)
  • Async service stubs
    • Unary-unary
    • Server streaming response
    • Client streaming request
  • Renaming messages and fields to conform to Python name standards
  • Renaming clashes with language keywords
  • Python package
  • Automate running tests
  • Cleanup!
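Several of the items above concern wire-format details; zig-zag encoding for the sint32/sint64 fields, for example, maps signed integers onto unsigned ones so that small negative values still encode compactly. A stdlib sketch of the transform (illustrative, not betterproto's internal code):

```python
def zigzag_encode(n: int, bits: int = 32) -> int:
    # sint32/sint64 zig-zag mapping: 0 -> 0, -1 -> 1, 1 -> 2, -2 -> 3, ...
    return (n << 1) ^ (n >> (bits - 1))

def zigzag_decode(z: int) -> int:
    # Inverse mapping back to a signed integer.
    return (z >> 1) ^ -(z & 1)

assert [zigzag_encode(n) for n in (0, -1, 1, -2, 2)] == [0, 1, 2, 3, 4]
assert all(zigzag_decode(zigzag_encode(n)) == n for n in range(-100, 100))
```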

Community

Join us on Slack!

License

Copyright © 2019 Daniel G. Taylor

http://dgt.mit-license.org/

python-betterproto's People

Contributors

124c41p, a-khabarov, abn, adriangb, anriha, atomicmac, badge, boukeversteegh, cetanu, danielgtaylor, dependabot[bot], efokschaner, emosenkis, fuegofro, gobot1234, jameslan, joshualeivers, kalzoo, luminaar, marekpikula, matejkastak, micaeljarniac, michaelosthege, nat-n, nickderobertis, olijeffers0n, qria, samuelyvon, ulasozguler, w4rum


python-betterproto's Issues

Add black to build pipeline and readme

As @nat-n suggested, enforcing black as a validation step in the pipeline would reduce the number of merge-conflicts, and make sure that all contributors are aware that this project requires the use of black.

We could also mention it in the readme, or keep the readme short and let contributors find out after pushing

I prefer mentioning it; before black, I spent some time formatting my code, which turned out to be wasted effort.

Split up the protoc plugin to make gRPC content configurable

Currently there is a single plugin with limited support for generating gRPC client code, and there is demand to cover servicer generation (#19), with grpclib or grpcio, or neither (#44, #20).

Since, as far as I know, there is no standard way to provide arguments to protoc plugins, I propose that the plugin be exposed as multiple distinct executables: one to generate just the protobuf dataclasses, one to generate the dataclasses and a grpclib client/server, and one to generate the dataclasses and a grpcio client/server.

Are there other approaches worth considering?

Google Types - Any support

Currently, betterproto does not understand Any messages, except that they have a URL and a bytes payload.

If you have suggestions for extended support, please be welcome to make them here.

Missing support for request metadata in stubs

Some APIs require adding metadata to requests, such as for authentication, but this is not supported by the generated stubs.

Supporting this would require modifying _unary_unary() and _unary_stream() to accept a metadata kwarg and pass it on to self.channel.request. Additionally, for use cases like mine, having the option to set default metadata on the Stub instance would be most useful.

I can create a PR if this approach is acceptable in principle.

ALL_CAPS message fields are parsed incorrectly.

$ cat filedata.proto
syntax = "proto3";
package filedata;

message FileData {
  uint32 FRAGMENT_NR = 1;
  bytes DATA = 2;
  uint64 UID = 3;
}

$ protoc --python_betterproto_out . filedata.proto
Writing __init__.py
Writing filedata.py
$ cat filedata.py
# Generated by the protocol buffer compiler.  DO NOT EDIT!
# sources: filedata.proto
# plugin: python-betterproto
from dataclasses import dataclass

import betterproto


@dataclass
class FileData(betterproto.Message):
    f_r_a_g_m_e_n_t__n_r: int = betterproto.uint32_field(1)
    d_a_t_a: bytes = betterproto.bytes_field(2)
    u_i_d: int = betterproto.uint64_field(3)
  • protoc 3.6.1
  • betterproto 1.2.2 (installed through pip)
  • python 3.7.3

I assume this has something to do with the parser thinking they are enums?

Rewriting our schema is unfortunately not an option; it's made to be backwards-compatible and encompasses thousands of fields over several hundred message types.

How to determine if field was set (not oneof)?

The vector tile spec, a commonly used format for map data, contains this message definition. Is there a way I can find out which one of the optional values was set? Or perhaps make all of them default to None instead of type-specific default values? As a workaround, I had to rewrite the protobuf spec to wrap them in a oneof ... and remove the extensions, but maintaining my own slightly incompatible version of the spec is clearly not ideal. Thanks!

        message Value {
                // Exactly one of these values must be present in a valid message
                optional string string_value = 1;
                optional float float_value = 2;
                optional double double_value = 3;
                optional int64 int_value = 4;
                optional uint64 uint_value = 5;
                optional sint64 sint_value = 6;
                optional bool bool_value = 7;

                extensions 8 to max;
        }

which generates this relevant code:

@dataclass
class TileValue(betterproto.Message):
    # Exactly one of these values must be present in a valid message
    string_value: str = betterproto.string_field(1)
    float_value: float = betterproto.float_field(2)
    double_value: float = betterproto.double_field(3)
    int_value: int = betterproto.int64_field(4)
    uint_value: int = betterproto.uint64_field(5)
    sint_value: int = betterproto.sint64_field(6)
    bool_value: bool = betterproto.bool_field(7)

@dataclass
class TileLayer(betterproto.Message):
    values: List["TileValue"] = betterproto.message_field(4)
    ...

Optimize dataclasses with __slots__

Adding __slots__ to a class improves memory usage and performance of attribute access, at the expense of not being able to set arbitrary attributes on the resulting objects.

This project demonstrates how dataclasses can be augmented to use slots, whilst still having class attributes set (normally adding the slots attribute to a class definition would cause an error with dataclasses that declare default values).

Making generated dataclasses use slots in this way would be a small breaking change, though I don't see how it could be too inconvenient to work with, and I expect the performance improvement would be worthwhile.
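For context, naively combining __slots__ with dataclass-style class-body defaults fails at class creation time, which is the complication the linked approach works around; on Python 3.10+ the standard library also solves this directly with @dataclass(slots=True). A small demonstration (the Greeting class here is a hypothetical stand-in for a generated message):

```python
from dataclasses import dataclass

# A class-body default clashes with the slot descriptor of the same name,
# so this raises ValueError while the class body is being created:
try:
    @dataclass
    class Bad:
        __slots__ = ("message",)
        message: str = ""
except ValueError:
    pass  # "'message' in __slots__ conflicts with class variable"

# A plain slotted class avoids the clash and gains the memory benefit:
class Greeting:
    __slots__ = ("message",)

    def __init__(self, message: str = "") -> None:
        self.message = message

assert not hasattr(Greeting(), "__dict__")  # no per-instance dict
```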

fixed/sfixed types convert to float

According to the protobuf language guide, the fixed32/fixed64 and sfixed32/sfixed64 types are integer types. However, when betterproto generates dataclasses it treats them as floats.

$ cat foo.proto
syntax = "proto3";
package foo;

message Foo {
  fixed32 bar = 1;
  sfixed32 baz = 2;
  fixed64 qux = 3;
  sfixed64 quux = 4;
}

$ protoc --python_betterproto_out . foo.proto
Writing foo.py
$ cat foo.py
# Generated by the protocol buffer compiler.  DO NOT EDIT!
# sources: foo.proto
# plugin: python-betterproto
from dataclasses import dataclass

import betterproto


@dataclass
class Foo(betterproto.Message):
    bar: float = betterproto.fixed32_field(1)
    baz: float = betterproto.sfixed32_field(2)
    qux: float = betterproto.fixed64_field(3)
    quux: float = betterproto.sfixed64_field(4)

  • betterproto 1.2.2 (from pypi)
  • protoc 3.6.1
  • python 3.7.3

Generation without gRPC

I would like to use betterproto for the generation of code from proto messages but without the usage of gRPC. However, there are still services defined in my .proto files for which I will be generating code with a different plugin. Thus, I need to disable the service generation of the betterproto plugin which, as far as I can tell, is not possible at the moment.

Importing well known types

Hello,

I'm not sure how to work with well known types.

Having a proto file like this:

syntax = "proto3";      
      
package simple;      
      
import "google/protobuf/empty.proto";      
      
service SimpleSender {      
    rpc Send(SendParams) returns (google.protobuf.Empty) {}      
}      
      
message SendParams {      
    string body = 1;      
}

I get output like this:

# Generated by the protocol buffer compiler.  DO NOT EDIT!      
# sources: Simple.proto      
# plugin: python-betterproto      
from dataclasses import dataclass      
      
import betterproto      
import grpclib          
      
from .google import protobuf      
      
      
@dataclass      
class SendParams(betterproto.Message):      
    body: str = betterproto.string_field(1)      
      
      
class SimpleSenderStub(betterproto.ServiceStub):      
    async def send(self, *, body: str = "") -> protobuf.Empty:      
        request = SendParams()      
        request.body = body         
      
        return await self._unary_unary(      
            "/simple.SimpleSender/Send", request, protobuf.Empty,      
        )

There is of course no google package. I tried to generate code from the proto files included in grpc_tools but I'm not able to get any Python code.

What is the best way to work with well known types?

Import bug - two packages with the same name suffix should not cause naming conflict

I am keen to use betterproto for a project that contains a simple hierarchy of protobuf files. There is a top-level proto file which contains the include statements:

package adapter.v1;

import "Src/Products/HFSS/Adapter/v1/API/aedt-method-data-v0.proto";
import "Src/Products/HFSS/Adapter/v1/API/aedt-properties-v0.proto";

The imported proto have package definitions of the form:

package aedt.method_data.v0;

and

package aedt.properties.v0;

In the Python generated by betterproto for the top-level proto I see:

from .aedt.method_data import v0
from .aedt.properties import v0

with an example reference:

@dataclass
class UpdateBigHouseRequest(betterproto.Message):
    # The big_house resource which replaces the resource on the server.
    big_house: v0.BigHouse = betterproto.message_field(1)
    # The update mask applies to the resource. For the `FieldMask` definition,
    # see https://developers.google.com/protocol-
    # buffers/docs/reference/google.protobuf#fieldmask The grpc gateway will add
    # entries for all fields in the request body JSON (i.e. the payload above) to
    # the update_mask. This means a REST client needs to filter-out fields not
    # present in the update_mask from the body JSON it sends.
    update_mask: protobuf.FieldMask = betterproto.message_field(2)

I hope you agree this is incorrect.

I think the solution is to revise the code in plugin.py lines 79 and 80 in the get_ref_type function
to something like:

        imports.add(f"import {'.'.join(parts[:-1])}")
        type_name = '.'.join(parts)

I have not tried this, as I have had difficulty building from a clone of the repository (perhaps the following should be a separate issue...).

I tried
python -m pipenv install --dev
at the root of the clone
and got:

Traceback (most recent call last):
  File "C:\Users\afinney\AppData\Local\Programs\Python\Python38\lib\runpy.py", line 193, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\afinney\AppData\Local\Programs\Python\Python38\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\__main__.py", line 4, in <module>
    cli()
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\vendor\click\core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\vendor\click\core.py", line 717, in main
    rv = self.invoke(ctx)
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\vendor\click\core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\vendor\click\core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\vendor\click\core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\vendor\click\decorators.py", line 64, in new_func
    return ctx.invoke(f, obj, *args, **kwargs)
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\vendor\click\core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\vendor\click\decorators.py", line 17, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\cli\command.py", line 235, in install
    retcode = do_install(
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\core.py", line 1734, in do_install
    ensure_project(
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\core.py", line 570, in ensure_project
    ensure_virtualenv(
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\core.py", line 494, in ensure_virtualenv
    python = ensure_python(three=three, python=python)
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\core.py", line 397, in ensure_python
    path_to_python = find_a_system_python(python)
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\core.py", line 360, in find_a_system_python
    python_entry = finder.find_python_version(line)
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\vendor\pythonfinder\pythonfinder.py", line 108, in find_python_version
    match = self.windows_finder.find_python_version(
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\vendor\pythonfinder\pythonfinder.py", line 63, in windows_finder
    self._windows_finder = WindowsFinder()
  File "<attrs generated init 7fb3eb7472e6350f51fd6e21deda174661985e11>", line 13, in __init__
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\vendor\pythonfinder\models\windows.py", line 92, in get_versions
    py_version = PythonVersion.from_windows_launcher(version_object)
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\vendor\pythonfinder\models\python.py", line 417, in from_windows_launcher
    creation_dict = cls.parse(launcher_entry.info.version)
  File "C:\Users\afinney\AppData\Roaming\Python\Python38\site-packages\pipenv\vendor\pythonfinder\_vendor\pep514tools\_registry.py", line 75, in __getattr__
    raise AttributeError(attr)
AttributeError: version

I then tried

python setup.py develop

which appeared to successfully create the protoc plugin
but running the protoc with the generated plugin did not create any generated code

Service response values are not unwrapped

As betterproto is aiming to support automatic (un)wrapping of well known types (google wrapper values), the service responses should also return the underlying values, instead of the wrapper objects.

syntax = "proto3";

import "google/protobuf/wrappers.proto";

// Tests that wrapped values can be used directly as return values

service Test {
    rpc GetDouble (Input) returns (google.protobuf.DoubleValue);
}

message Input {

}
response = await TestStub(channel).get_double()

assert type(response) == float # fails!

Crash when field has the same name as a system type

This oneof definition generates valid Python code, but later it is not possible to parse or create an instance of TileValue because the name of the field is the same as the type. Should betterproto rename such fields automatically, or at least throw an error during the code generation? I spent considerable time trying to understand why this line was saying t is not a class:

elif issubclass(t, Enum):

message Tile {
        message Value {
                oneof val {
                        string string = 1;
                        float float = 2;
                        double double = 3;
                        int64 int = 4;
                        uint64 uint = 5;
                        sint64 sint = 6;
                        bool bool = 7;
                }
        }

        repeated Value values = 1;
}
@dataclass
class Tile(betterproto.Message):
    values: List["TileValue"] = betterproto.message_field(1)


@dataclass
class TileValue(betterproto.Message):
    string: str = betterproto.string_field(1, group="val")
    float: float = betterproto.float_field(2, group="val")
    double: float = betterproto.double_field(3, group="val")
    int: int = betterproto.int64_field(4, group="val")
    uint: int = betterproto.uint64_field(5, group="val")
    sint: int = betterproto.sint64_field(6, group="val")
    bool: bool = betterproto.bool_field(7, group="val")

Gracefully handle encoding of different number types

If a float is set on a Message field of type int, an exception is raised when serialising. Floats and ints are largely interchangeable in Python 3, so betterproto should perhaps handle this more gracefully, for example by casting to int or float as appropriate before encoding.

The resulting error message includes:

unsupported operand type(s) for &: 'float' and 'int'
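A minimal sketch of such coercion, assuming the encoder knows the field's proto type name (the `coerce_for_encoding` helper and the type-name strings are illustrative, not betterproto's API):

```python
# Hypothetical helper, not betterproto's API: coerce a Python number to the
# kind the proto field expects before bit-level encoding.
INT_TYPES = {"int32", "int64", "uint32", "uint64", "sint32", "sint64",
             "fixed32", "fixed64", "sfixed32", "sfixed64"}
FLOAT_TYPES = {"float", "double"}

def coerce_for_encoding(value, proto_type):
    if proto_type in INT_TYPES and isinstance(value, float):
        return int(value)  # e.g. 2.0 -> 2, so `value & mask`-style ops work
    if proto_type in FLOAT_TYPES and isinstance(value, int):
        return float(value)
    return value
```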

Reflection doesn't work with grpclib

Steps to reproduce: generate any proto, create a grpclib server, then add reflection. Lastly, try asking for the reflection info using grpcurl: it will fail.

Slack chat is invite only?

Hey folks,

I'm looking to get involved with betterproto development and tried to join slack. It seems like it is invite only though? Is there a way to get an invite? Is there a public slack people can easily join?

This is the link I tried https://betterproto.slack.com/


support for proto3 `optional` (explicit field presence)

In betterproto it seems like all fields are optional, in that proto.field = None is a valid operation. However, there doesn't seem to be any way to indicate a missing field when using the constructor; instead, missing fields get their default values (e.g., 0 for an int).

message Pizza {
  string name  = 1;
  int32 radius = 2; 
}

>>> pizza = betterproto.Pizza(name='neapolitan', radius=9)
betterproto.Pizza(name='neapolitan', radius=9)
>>> pizza.radius = None
>>> pizza
betterproto.Pizza(name='neapolitan', radius=None)

>>> pizza = betterproto.Pizza(name='neapolitan')
betterproto.Pizza(name='neapolitan', radius=0)

>>> pizza = betterproto.Pizza(name='neopolitan', radius=None)
*** SyntaxError: invalid syntax

If I understand this announcement correctly, the optional keyword is now available in proto3 in experimental release 3.12. Is there any plan for the betterproto compiler to take advantage of the new proto3 optional keyword to differentiate optional and required fields? It would be great if only explicitly optional fields permitted assignment of None, whether directly or through the constructor.
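A sketch of what explicit presence could look like on the generated dataclass, assuming a proto3 `optional` field maps to `Optional[...]` with a `None` default (the `has_radius` accessor is hypothetical):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Pizza:
    """Sketch of generated output for:
        string name = 1;
        optional int32 radius = 2;  // proto3 explicit presence
    """
    name: str = ""
    radius: Optional[int] = None  # None means "not set"; 0 stays meaningful

    @property  # hypothetical presence accessor
    def has_radius(self) -> bool:
        return self.radius is not None
```

Here `Pizza(name='neapolitan')` leaves `radius` unset, while `Pizza(radius=0)` records an explicit zero.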

Thank you for your great work!

I've been searching for a gRPC Python client suite for the past two days for my project @chatie/grpc (which already has a gRPC server in TypeScript). I read lots of articles (and code), but found none that fit my needs until I landed here!

My requirements are:

  1. static typing
  2. static client method

Thank you for your great work for saving my day!

Editable package "v" makes newest pipenv fail

In Pipfile, there's such line in dev-packages:

v = {editable = true,version = "*"}

pipenv 2018.11.26 doesn't complain but the head of pipenv master branch will fail because an editable package should be a local path or git repo URL.

Also,

$ pip install -e v
ERROR: v is not a valid editable requirement. It should either be a path to a local project or a VCS URL (beginning with svn+, git+, hg+, or bzr+).

Import Bug - No arguments are generated for stub methods when using `import` with proto definition

Indirect recursive message - RecursionError: maximum recursion depth hit on encoding initial default values

Hi there,
I really prefer the betterproto protobuf output to Google's, so well done on that! I've come across the following issue, and narrowed it down to the simplest proto I could that reproduces it.

Initialising the following proto (after compiling with python-betterproto)

syntax = "proto3";
package schemas;

message Action {
    oneof action_oneof {
        ref not_repeated = 2;
    }
}

message ref {
    string name = 2;
    Action action = 3;
}

with a simple test = schemas.Action()

causes recursion and a failure to create the test object. As far as I can tell, it's attempting to work out the default values for the initial proto.

File "<string>", line 4, in __init__
  File "/lib/python3.7/site-packages/betterproto/__init__.py", line 460, in __post_init__
    setattr(self, field.name, self._get_field_default(field, meta))
  File "/lib/python3.7/site-packages/betterproto/__init__.py", line 593, in _get_field_default
    value = t()
  File "<string>", line 3, in __init__
  File "/lib/python3.7/site-packages/betterproto/__init__.py", line 460, in __post_init__
    setattr(self, field.name, self._get_field_default(field, meta))

(the frames above repeat), then:
RecursionError: maximum recursion depth exceeded while calling a Python object

When the proto definition is changed to

message ref {
    string name = 2;
    repeated Action action = 3;
}

then the recursion error does not occur.

The standard 'protoc --python_out=' does not exhibit this behaviour, hence I thought I'd raise this.
cheers
Nick

Installation fails on some systems

I'm a bit baffled about the root cause, but I have simple reproduction steps and a simple fix.

The issue can be reproduced by running this Dockerfile:

FROM centos:7.5.1804
RUN yum update -y && yum install -y python3
RUN python3 -m pip install --upgrade pip
CMD ["python3", "-m", "pip", "install", "betterproto"]

like: docker run --rm -it $(docker build -q .)

Will produce output like:

Collecting betterproto
  Downloading betterproto-1.2.3.tar.gz (24 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  ERROR: Command errored out with exit status 1:
   command: /usr/bin/python3 /usr/local/lib/python3.6/site-packages/pip/_vendor/pep517/_in_process.py get_requires_for_build_wheel /tmp/tmptildjr2u
       cwd: /tmp/pip-install-tgfsl0y5/betterproto
  Complete output (20 lines):
  Traceback (most recent call last):
    File "/usr/local/lib/python3.6/site-packages/pip/_vendor/pep517/_in_process.py", line 280, in <module>
      main()
    File "/usr/local/lib/python3.6/site-packages/pip/_vendor/pep517/_in_process.py", line 263, in main
      json_out['return_val'] = hook(**hook_input['kwargs'])
    File "/usr/local/lib/python3.6/site-packages/pip/_vendor/pep517/_in_process.py", line 114, in get_requires_for_build_wheel
      return hook(config_settings)
    File "/tmp/pip-build-env-eqgnnb9a/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 148, in get_requires_for_build_wheel
      config_settings, requirements=['wheel'])
    File "/tmp/pip-build-env-eqgnnb9a/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 128, in _get_build_requires
      self.run_setup()
    File "/tmp/pip-build-env-eqgnnb9a/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 250, in run_setup
      self).run_setup(setup_script=setup_script)
    File "/tmp/pip-build-env-eqgnnb9a/overlay/lib/python3.6/site-packages/setuptools/build_meta.py", line 143, in run_setup
      exec(compile(code, __file__, 'exec'), locals())
    File "setup.py", line 7, in <module>
      long_description=open("README.md", "r").read(),
    File "/usr/lib64/python3.6/encodings/ascii.py", line 26, in decode
      return codecs.ascii_decode(input, self.errors)[0]
  UnicodeDecodeError: 'ascii' codec can't decode byte 0xc2 in position 11237: ordinal not in range(128)

The issue is that setup.py reads the README, apparently in ASCII mode instead of UTF-8, even though it's Python 3.6.8 (which doesn't behave this way on macOS, for example), and finds some non-ASCII characters. It's not obvious exactly what's wrong with the README, but setup.py is easily fixed by updating line 7 to force UTF-8:

long_description=open("README.md", "r", encoding="utf-8").read(),

Is proto2 syntax supported?

The example in the readme uses proto3 syntax. Is proto2 syntax also supported? Please mention this in the readme. Thx

Split code up into multiple modules

Currently both the plugin and the runtime library are single modules. Maybe there's an element of personal preference here, but I think splitting them up into more single-purpose modules (and possibly sub-packages) would make the code easier to understand and work with.

It would also add some flexibility for optimisations such as only importing the parts that are required, e.g. excluding client/server code if not needed.

I'd be happy to work on splitting these modules up if there are no objections (maybe at a time when there are fewer open PRs). @danielgtaylor

Import bug - Message from Capitalized package is mistaken for Nested Type

The compiler does not distinguish the following two cases:

package somepackage.Core;

message Message {}

package somepackage;

message Core {
  message Message {}
}

as they are both:

somepackage.Core.Message

Protoc (as far as I've been able to figure out) reports the full type name, but not which part of that is the package, and which is the real type (somepackage.Core: Message vs somepackage: Core.Message).

In 99% of cases, this will not be a problem: if users follow the Protobuf Style Guide, packages will be lowercase and types CapitalCamelCase, and the distinction can be made based on that.

https://developers.google.com/protocol-buffers/docs/style#packages

Package name should be in lowercase, and should correspond to the directory hierarchy. e.g., if a file is in my/package/, then the package name should be my.package.

https://developers.google.com/protocol-buffers/docs/style?hl=cs-CZ#message-and-field-names

Use CamelCase (with an initial capital) for message names – for example, SongServerRequest. Use underscore_separated_names for field names (including oneof field and extension names) – for example, song_name.

Despite these style guides, there may be some users who are using Capitalized packages and for some reason cannot change the packages to lowercase.

Then, these packages will not be imported correctly from dependent packages/messages.
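The case-based heuristic described above could be sketched like this (a hypothetical helper which, by construction, misclassifies capitalized packages such as somepackage.Core):

```python
def split_package_and_type(full_name: str):
    """Heuristically split a full protobuf name into (package, type) by
    assuming packages are lowercase and type names start with a capital,
    per the style guide. A sketch; it fails for capitalized packages."""
    parts = full_name.split(".")
    for i, part in enumerate(parts):
        if part[:1].isupper():
            # First capitalized component starts the (possibly nested) type.
            return ".".join(parts[:i]), ".".join(parts[i:])
    return full_name, ""
```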

PascalCase support

For a PascalCase field, betterproto uses snake_case for the class attribute and can handle multiple casings in input JSON. However, it converts the name to camelCase in the output JSON, whereas protobuf keeps the original PascalCase name. This makes the JSON incompatible.

For example,

message Test {
  int32 camelCase = 1;
  my_enum snake_case = 2;
  snake_case_message snake_case_message = 3;
  int32 PascalCase =4;
}

betterproto will output

{"camelCase": 1, "pascalCase": 3, "snakeCase": "ONE"}

but protobuf is expecting

{"PascalCase": 3, "camelCase": 1, "snakeCase": "ONE"}
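For reference, protoc derives the default JSON name by dropping each underscore and capitalizing the letter that followed it, leaving everything else (including a leading capital) untouched. A sketch of that rule:

```python
def json_name(field_name: str) -> str:
    """protoc's default JSON name: drop each underscore and capitalize the
    letter that followed it; all other characters (including a leading
    capital) are left untouched."""
    out, upper_next = [], False
    for ch in field_name:
        if ch == "_":
            upper_next = True
        else:
            out.append(ch.upper() if upper_next else ch)
            upper_next = False
    return "".join(out)

assert json_name("PascalCase") == "PascalCase"  # not lowercased to pascalCase
```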

AttributeError: type object 'MyType' has no attribute '_serialized_on_wire'

Trying to dump my structure to a dict for validation, I get

AttributeError: type object 'Emi3LocationInfoType' has no attribute '_serialized_on_wire' 
.venv/lib/python3.8/site-packages/betterproto/__init__.py:732: AttributeError

Here is the generated class

@dataclass
class Emi3LocationInfoType(betterproto.Message):
    """
    Contains full location information about station: name, description, type,
    timezone, address, coordinates, owner.
    """

    # Geographical coordinate information giving the precise location.
    geo_coordinate: "Emi3GeoCoordinate" = betterproto.message_field(1)
    # Detailed address of the location.
    address: "Emi3AddressType" = betterproto.message_field(2)
    # Official name of the location.
    name: "Emi3MultilingualText" = betterproto.message_field(3)
    # Location owner name.
    property_owner: "Emi3MultilingualText" = betterproto.message_field(4)
    # Geographical coordinate for parking entrances. This field should be omitted
    # if a separately identifiable parking entrance does not exist or the
    # entrance is the same as the location of the charging station.
    parking_entrance_geo_coordinate: "Emi3GeoCoordinate" = betterproto.message_field(5)
    # Specific details about the surrounding environment (restriction).
    description: "Emi3MultilingualText" = betterproto.message_field(6)
    # The general type of the charge point location, like onstreet, underground
    # garage etc.
    location_type: "Emi3LocationInfoTypeEmi3LocationTypeEnum" = betterproto.enum_field(
        7
    )
    # The time zone of the location.
    time_zone: "Emi3TimeZone" = betterproto.message_field(8)

Switch pipenv for poetry

Poetry is a popular alternative to pipenv that also abstracts over dependency and build management using pyproject.toml, but arguably does a better job, particularly in terms of simplifying the process.

I've seen several cases (in addition to my own experience) of contributors on this repo having some difficulty with pipenv, and I think switching to poetry would make the process smoother thus lowering this barrier a little.

Five minutes of googling paints a fairly one-sided picture of the relative benefits of poetry vs pipenv.

I'd be happy to work on migrating the project to poetry if there are no objections. @danielgtaylor

Parsing a protobuf FromString raises KeyError: 'uint32'

Hello, I am trying to parse a protobuf that was received from a websocket.

@dataclass
class CMsgClientLogon(betterproto.Message):
    protocol_version: int = betterproto.uint32_field(1)
    deprecated_obfustucated_private_ip: int = betterproto.uint32_field(2)
    cell_id: int = betterproto.uint32_field(3)
    last_session_id: int = betterproto.uint32_field(4)
    client_package_version: int = betterproto.uint32_field(5)
    client_language: str = betterproto.string_field(6)
    client_os_type: int = betterproto.uint32_field(7)
    should_remember_password: bool = betterproto.bool_field(8)
    wine_version: str = betterproto.string_field(9)
    deprecated_10: int = betterproto.uint32_field(10)
    obfuscated_private_ip: "CMsgIPAddress" = betterproto.message_field(11)
    deprecated_public_ip: int = betterproto.uint32_field(20)
    qos_level: int = betterproto.uint32_field(21)
    client_supplied_steam_id: float = betterproto.fixed64_field(22)
    public_ip: "CMsgIPAddress" = betterproto.message_field(23)
    machine_id: bytes = betterproto.bytes_field(30)
    launcher_type: int = betterproto.uint32_field(31)
    ui_mode: int = betterproto.uint32_field(32)
    chat_mode: int = betterproto.uint32_field(33)
    steam2_auth_ticket: bytes = betterproto.bytes_field(41)
    email_address: str = betterproto.string_field(42)
    rtime32_account_creation: float = betterproto.fixed32_field(43)
    account_name: str = betterproto.string_field(50)
    password: str = betterproto.string_field(51)
    game_server_token: str = betterproto.string_field(52)
    login_key: str = betterproto.string_field(60)
    was_converted_deprecated_msg: bool = betterproto.bool_field(70)
    anon_user_target_account_name: str = betterproto.string_field(80)
    resolved_user_steam_id: float = betterproto.fixed64_field(81)
    eresult_sentryfile: int = betterproto.int32_field(82)
    sha_sentryfile: bytes = betterproto.bytes_field(83)
    auth_code: str = betterproto.string_field(84)
    otp_type: int = betterproto.int32_field(85)
    otp_value: int = betterproto.uint32_field(86)
    otp_identifier: str = betterproto.string_field(87)
    steam2_ticket_request: bool = betterproto.bool_field(88)
    sony_psn_ticket: bytes = betterproto.bytes_field(90)
    sony_psn_service_id: str = betterproto.string_field(91)
    create_new_psn_linked_account_if_needed: bool = betterproto.bool_field(92)
    sony_psn_name: str = betterproto.string_field(93)
    game_server_app_id: int = betterproto.int32_field(94)
    steamguard_dont_remember_computer: bool = betterproto.bool_field(95)
    machine_name: str = betterproto.string_field(96)
    machine_name_userchosen: str = betterproto.string_field(97)
    country_override: str = betterproto.string_field(98)
    is_steam_box: bool = betterproto.bool_field(99)
    client_instance_id: int = betterproto.uint64_field(100)
    two_factor_code: str = betterproto.string_field(101)
    supports_rate_limit_response: bool = betterproto.bool_field(102)
    web_logon_nonce: str = betterproto.string_field(103)
    priority_reason: int = betterproto.int32_field(104)
    embedded_client_secret: "CMsgClientSecret" = betterproto.message_field(105)

>>> CMsgClientLogon.FromString(b"\x8a\x15\x00\x80\t\x00\x00\x00\t\xc2L'\x11\x01\x00\x10\x01\x08\xac\x80\x048\xc4\xfa\xff\xff\x0f\xa8\x01\x02\x80\x02\x04\x88\x02\x02\x92\x03\ngobot12342\xa0\x06\x00\xba\x068gvvcXnEbE1YAAAAAAAAAAAAAAAAAAAAAAwDUJifU4Ce9czLf/NN85VZV")
Traceback (most recent call last):
  File "<input>", line 1, in <module>
  File "/Users/James/PycharmProjects/steam.py/venv/lib/python3.7/site-packages/betterproto/__init__.py", line 773, in FromString
    return cls().parse(data)
  File "/Users/James/PycharmProjects/steam.py/venv/lib/python3.7/site-packages/betterproto/__init__.py", line 754, in parse
    parsed.wire_type, meta, field, parsed.value
  File "/Users/James/PycharmProjects/steam.py/venv/lib/python3.7/site-packages/betterproto/__init__.py", line 695, in _postprocess_single
    fmt = _pack_fmt(meta.proto_type)
  File "/Users/James/PycharmProjects/steam.py/venv/lib/python3.7/site-packages/betterproto/__init__.py", line 282, in _pack_fmt
    }[proto_type]
KeyError: 'uint32'

I'm not sure whether this is caused by the dataclass I'm using or by a library error, so I thought I'd better ask.

Version Info

Python Version: 3.7.5
Betterproto: 1.2.5

Thanks 👍

Generate Servicer?

Currently it seems like this package is intended for gRPC clients written in Python, meaning that people who also write their gRPC servers in Python have to generate their Python bindings the old-fashioned way as well as betterproto bindings for their clients. Is it on the roadmap to generate a servicer as well? It would be nice to be able to use this package on both sides.

Nested messages in JSON

I'm getting some unexpected behavior when doing JSON serialization. Simple example:

        event = Event(
            event_id=str(uuid.uuid4()),
        )

        envelope = Envelope(
            event=event,
        )

        data = envelope.to_json().encode('utf-8')

When I go to serialize the envelope, event is not a serialized property on the output. Is this intended behavior?

Generated client cannot handle Well Known Types as RPC return values

Hi!

I'm running into some trouble returning google.protobuf.Int32Value from a unary RPC call.

This is my service:

service Go {
    // ...
    rpc StartTrainingGame (TrainingGameRequest) returns (google.protobuf.Int32Value);
}

When I call that service (await self.client.start_training_game(**kwargs)), an exception is thrown

'_SpecialForm' object has no attribute 'FromString'.

Debugging reveals the following stack frame (proto.py:54, in grpclib)

    def decode(
        self,  # <grpclib.encoding.proto.ProtoCodec object at 0x000001CE2A687BC8>
        data: bytes,  # b'\x08\x04'
        message_type: Type['IProtoMessage'],  # typing.Union[int, NoneType]
    ) -> 'IProtoMessage':
        return message_type.FromString(data)

As is visible, message_type is expected to be IProtoMessage, with a method FromString on it. However, because the Well-Known type is converted to Union[int, NoneType], this method cannot be called there.

I suspect that the current implementation handles these well-known types separately, and this works well when they are embedded in a message, but not when they are the top-level return-value from an RPC call. I intended to use Google's wrappers mainly to be able to return primitive values from RPC-calls.

The generated Client shows the following:

    async def start_training_game(...) -> Optional[int]:
        ...
        return await self._unary_unary(
            "/anagon.ai.go.Go/StartTrainingGame", request, Optional[int],
        )

I suppose that the generated code should specify the type as the python-generated equivalent of Int32Value, instead of Optional[int].
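A sketch of that proposed fix: hand the transport the concrete wrapper message (which has FromString in the real generated code), then unwrap before returning. `Int32Value` here is a hypothetical stand-in, and `fake_unary_unary` stands in for grpclib's unary-unary call:

```python
import asyncio
from dataclasses import dataclass
from typing import Optional

@dataclass
class Int32Value:
    """Hypothetical stand-in for the generated google.protobuf.Int32Value."""
    value: int = 0

async def start_training_game(unary_unary, route: str, request) -> Optional[int]:
    # Pass the concrete wrapper type to the transport so it can call
    # FromString on it, then unwrap to the primitive for the caller.
    wrapper = await unary_unary(route, request, Int32Value)
    return wrapper.value

async def fake_unary_unary(route, request, response_type):
    # Stand-in transport; a real call would decode b'\x08\x04' into value=4.
    return response_type(value=4)

result = asyncio.run(
    start_training_game(fake_unary_unary, "/anagon.ai.go.Go/StartTrainingGame", None)
)
assert result == 4
```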

to_dict returns wrong enum fields when numbering is not consecutive

Protobuf spec file like this:

syntax = "proto3";
package message;
message TeraMessage {
	int64 timestamp = 1;
	enum MessageType {
		MESSAGE_TYPE_UNKNOWN = 0;
		MESSAGE_TYPE_ACTION_MESSAGE = 1;
		// @exclude MESSAGE_TYPE_COMMAND_MESSAGE = 2; // DEPRECATED
		MESSAGE_TYPE_CONFIG_MESSAGE = 3;
		MESSAGE_TYPE_HEARTBEAT_MESSAGE = 4;
	}
	MessageType message_type = 5;
	bytes message = 6;
}

Generates the following Python bindings:

from dataclasses import dataclass
import betterproto

class TeraMessageMessageType(betterproto.Enum):
    MESSAGE_TYPE_UNKNOWN = 0
    MESSAGE_TYPE_ACTION_MESSAGE = 1
    # NB: notice that 2 is missing
    MESSAGE_TYPE_CONFIG_MESSAGE = 3
    MESSAGE_TYPE_HEARTBEAT_MESSAGE = 4

@dataclass
class TeraMessage(betterproto.Message):
    timestamp: int = betterproto.int64_field(1)
    message_type: "TeraMessageMessageType" = betterproto.enum_field(5)
    message: bytes = betterproto.bytes_field(6)

To reproduce the bug:

>>> from my.path.message import TeraMessage, TeraMessageMessageType
>>> message = TeraMessage(message_type=TeraMessageMessageType.MESSAGE_TYPE_CONFIG_MESSAGE)
>>> message.to_dict()
{'messageType': 'MESSAGE_TYPE_HEARTBEAT_MESSAGE'}
>>> message.to_json()
'{"messageType": "MESSAGE_TYPE_HEARTBEAT_MESSAGE"}'
>>> TeraMessage().parse(bytes(message)).message_type == TeraMessageMessageType.MESSAGE_TYPE_CONFIG_MESSAGE
True
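The observed output is consistent with looking enum members up by position rather than by value; with Python's stdlib enum, the two lookups diverge exactly when the numbering has gaps:

```python
import enum

class TeraMessageMessageType(enum.IntEnum):
    MESSAGE_TYPE_UNKNOWN = 0
    MESSAGE_TYPE_ACTION_MESSAGE = 1
    # NB: 2 is missing
    MESSAGE_TYPE_CONFIG_MESSAGE = 3
    MESSAGE_TYPE_HEARTBEAT_MESSAGE = 4

# Correct: look the member up by its value.
assert TeraMessageMessageType(3).name == "MESSAGE_TYPE_CONFIG_MESSAGE"
# Buggy pattern: treating the value as a position skips the gap at 2.
assert list(TeraMessageMessageType)[3].name == "MESSAGE_TYPE_HEARTBEAT_MESSAGE"
```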

Naming bug for twice-nested messages

Hello!

I have an interesting one for you.

I have not had time to look into exactly why this occurs, thought I should report it first because I can reproduce it and N brains are better than 1.

Problem

Twice-nested messages result in type hints which reference the correct dataclass, but the innermost dataclass has the name of the root message prepended twice.

Example proto file

syntax = "proto3";

message Root {
  message Parent {
    message Child {
      string foo = 1;
    }
    reserved 1;
    repeated Child child = 2;
    bool bar = 5;
  }
  string name = 1;
  Parent parent = 4;
}

Resulting output

# Generated by the protocol buffer compiler.  DO NOT EDIT!
# sources: testfile.proto
# plugin: python-betterproto
from dataclasses import dataclass
from typing import List

import betterproto


@dataclass
class Root(betterproto.Message):
    name: str = betterproto.string_field(1)
    parent: "RootParent" = betterproto.message_field(4)


@dataclass
class RootParent(betterproto.Message):
    child: List["RootParentChild"] = betterproto.message_field(2)
    bar: bool = betterproto.bool_field(5)


@dataclass
class RootRootParentChild(betterproto.Message): # <-------- "RootRoot"
    foo: str = betterproto.string_field(1)

Steps to reproduce

from pathlib import Path
from grpc_tools import protoc


protocol_buffer = '''
syntax = "proto3";

message Root {
  message Parent {
    message Child {
      string foo = 1;
    }
    reserved 1;
    repeated Child child = 2;
    bool bar = 5;
  }
  string name = 1;
  Parent parent = 4;
}
'''
testfile = Path('testfile.proto')
testfile.write_text(protocol_buffer, encoding='utf-8')

output = Path('BUILD')

proto_include = protoc.pkg_resources.resource_filename('grpc_tools', '_proto')
args = [
    __file__,
    f'--proto_path=.',
    f'--proto_path={proto_include}',
    f'--python_betterproto_out={output}',
    str(testfile)
]
protoc.main(args)
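The expected naming can be sketched as a walk that prefixes each enclosing message name exactly once (illustrative code, not the plugin's implementation):

```python
def collect_message_names(messages, prefix=""):
    """Walk a {name: {nested: ...}} tree of messages and yield the flattened
    dataclass name for each, prefixing enclosing names exactly once."""
    for name, nested in messages.items():
        full = prefix + name
        yield full
        yield from collect_message_names(nested, prefix=full)

tree = {"Root": {"Parent": {"Child": {}}}}
assert list(collect_message_names(tree)) == ["Root", "RootParent", "RootParentChild"]
```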

datetime support in .from_dict()

Good day, assuming I have
d = {'date_of_birth': datetime.datetime(2000, 1, 1)}
and a proto including a date_of_birth field of type Timestamp.
When proto.from_dict(d) is called, it assumes the value of date_of_birth is an ISO string and does not consider that it may already be a datetime.datetime

I refer to your __init__.py L799-803:

elif isinstance(v, datetime):
    v = datetime.fromisoformat(
        value[key].replace("Z", "+00:00")
    )
    setattr(self, field.name, v)

This should check if value[key] is already an instance of datetime.datetime and assign it directly, shouldn't it?
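The suggested fix can be sketched as a small coercion helper (hypothetical name) that accepts either representation:

```python
from datetime import datetime, timezone

def coerce_timestamp(value):
    """Hypothetical helper: accept either an ISO-8601 string or an existing
    datetime when populating a Timestamp field from a dict."""
    if isinstance(value, datetime):
        return value  # already a datetime, assign directly
    return datetime.fromisoformat(value.replace("Z", "+00:00"))
```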

Recursive message support

I am working on implementing a GRPC client which has a message such as this:

message Failure {
    ...
    Failure cause = 4;
    ...
}

Unfortunately it seems like betterproto attempts to build an object graph for this and this results in the following error:

RecursionError: maximum recursion depth exceeded while calling a Python object

I was wondering whether the project is planning on supporting this.

My plan in the short run is to just fork betterproto for my project and not set a default value for sub-messages in _get_field_default.

If the maintainers have a better approach to fixing this I would gladly offer my time in submitting a PR for it.
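Along the lines of the plan above, one way to break the cycle is to default recursive sub-message fields to None instead of eagerly instantiating them:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Failure:
    """Sketch: the recursive sub-message defaults to None instead of an
    eagerly constructed Failure(), which would recurse forever."""
    message: str = ""
    cause: Optional["Failure"] = None

f = Failure(message="outer", cause=Failure(message="inner"))
assert f.cause.message == "inner"
assert Failure().cause is None
```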

Compiled comments are stripped of newlines

Compare the line breaks in the comments 👇

// `Any` contains an arbitrary serialized protocol buffer message along with a
// URL that describes the type of the serialized message.
//
// Protobuf library provides support to pack/unpack Any values in the form
// of utility functions or additional generated methods of the Any type.
message Any {
    
}

@dataclass
class Any(betterproto.Message):
    """
    `Any` contains an arbitrary serialized protocol buffer message along with a
    URL that describes the type of the serialized message. Protobuf library
    provides support to pack/unpack Any values in the form of utility functions
    or additional generated methods of the Any type. 
    """

Option for generating code without async?

I have some infrastructure developed around the previously generated gRPC code, but I would really like to adopt betterproto for all its extra Pythonic features.

The only hurdle I'm facing is that the async implementation is on by default. Is there a way to turn this off and not generate the async code?

Messages should allow fields that are Python keywords

The following field-names should be allowed in messages.

From python 3.7.6

False
None
True
and
as
assert
async
await
break
class
continue
def
del
elif
else
except
finally
for
from
global
if
import
in
is
lambda
nonlocal
not
or
pass
raise
return
try
while
with
yield

From this list, currently only False, None, and True are not supported.
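A sketch of the usual workaround, appending an underscore (PEP 8's convention for keyword collisions) via the stdlib keyword module; this assumes the (de)serializer maps the suffixed attribute back to the original wire field name:

```python
import keyword

def pythonize_field_name(name: str) -> str:
    """Append '_' to names that collide with Python keywords (PEP 8's
    convention). Assumes the (de)serializer maps the suffixed attribute
    back to the original field name on the wire."""
    return f"{name}_" if keyword.iskeyword(name) else name
```

Note that `keyword.iskeyword` covers False, None, and True as well, since they are keywords in Python 3.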

Saying hello

Howdy.

I worked on a project today, and noticed I am the first "dependent" for your package. So I thought I would just say hi.

I really appreciate the effort put in to make protocol buffers represented as Python objects in a way that isn't a complete mess.

Thanks for your work. I might pop around some time if I can contribute some kind of improvement.

💟

to_dict() method converts integer field values to strings

When using the to_dict method of a message, integer fields are converted to strings. This can be seen on this line in the source code.

Why are integers converted to strings, while all other types are left as-is? Is this a bug that I can open a PR for, or is it deliberate? Any help is much appreciated, as this seems like a bit of odd behaviour.
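For context, this matches the official proto3 JSON mapping, which requires 64-bit integer fields (int64, uint64, fixed64, etc.) to serialise as decimal strings, because JSON numbers are IEEE-754 doubles and lose precision above 2**53. A small stdlib demonstration:

```python
import json

# Round-tripping a large 64-bit value through a double is lossy,
# which is why the proto3 JSON mapping uses strings for 64-bit ints.
big = 2**60 + 1
assert big != int(float(big))  # the double round-trip drops the +1
assert int(json.loads(json.dumps({"id": str(big)}))["id"]) == big  # the string survives
```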
