pybind11_protobuf's Introduction

Pybind11 bindings for Google's Protocol Buffers

[TOC]

Overview

These adapters make Protocol Buffer message types work with Pybind11 bindings.

To use the proto messages with pybind11:

  1. Include the header file pybind11_protobuf/native_proto_caster.h in the .cc file with your bindings.
  2. Call pybind11_protobuf::ImportNativeProtoCasters(); in your PYBIND11_MODULE definition.
  3. Ensure "@com_google_protobuf//:protobuf_python" is a python dependency. When using Bazel, a common strategy is to add a python library that "wraps" the extension along with any required python dependencies.
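The wrapping strategy from step 3 can be sketched as a hypothetical BUILD fragment (all target, file, and repository names here are placeholders and will vary with your WORKSPACE setup; `pybind_extension` comes from pybind11_bazel):

```python
# Hypothetical BUILD sketch of the "wrapping" strategy (names are placeholders).
load("@pybind11_bazel//:build_defs.bzl", "pybind_extension")

pybind_extension(
    name = "my_module",  # produces my_module.so
    srcs = ["my_module.cc"],
    deps = [
        "//path/to/my:my_message_cc_proto",
        "@pybind11_protobuf//pybind11_protobuf:native_proto_caster",
    ],
)

# Python "wrapper" library that bundles the extension with the required
# Python protobuf dependency.
py_library(
    name = "my_module_py",
    data = [":my_module.so"],
    deps = ["@com_google_protobuf//:protobuf_python"],
)
```

Python code then depends on `:my_module_py` rather than on the extension directly, so the protobuf Python runtime is always on the path.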

Any arguments or return values which are a protocol buffer (including the base class, proto2::Message) will be automatically converted to python native protocol buffers.

Basic Example

#include <pybind11/pybind11.h>

#include "path/to/my/my_message.proto.h"
#include "pybind11_protobuf/native_proto_caster.h"

// In real use, these two functions would probably be defined in a python-agnostic library.
MyMessage ReturnMyMessage() { ... }
void TakeMyMessage(const MyMessage& in) { ... }

PYBIND11_MODULE(my_module, m) {
  pybind11_protobuf::ImportNativeProtoCasters();

  m.def("return_my_message", &ReturnMyMessage);
  m.def("take_my_message", &TakeMyMessage, pybind11::arg("in"));
}

C++ Native vs Python Native Types

When passing protos between C++ and Python, the native_proto_caster.h bindings will convert protobuf objects to the native type on either side.

While C++ has only one native type, Python has two native types, selected at build time (https://rules-proto-grpc.com/en/latest/lang/python.html):

  • --define=use_fast_cpp_protos=false (aka use_pure_python_protos)
  • --define=use_fast_cpp_protos=true

With use_pure_python_protos, protobuf objects passed between C++ and Python are always serialized/deserialized between the native C++ type and the pure Python native type. This is very safe but also slow.

With use_fast_cpp_protos, the native Python type is internally backed by the native C++ type, which unlocks various performance benefits even when only used from Python. When passing protobuf objects between Python and C++, the serialization/deserialization overhead can be avoided in certain situations, but not universally.

Fundamentally, sharing C++ native protobuf objects between C++ and Python is unsafe: C++ assumes that it has exclusive ownership and may manipulate references in a way that undermines Python's much safer ownership semantics. Because of this, sharing mutable references or pointers between C++ and Python is not allowed. However, when passing a Python protobuf object to C++, and with PYBIND11_PROTOBUF_ASSUME_FULL_ABI_COMPATIBILITY defined (see proto_cast_util.h), the bindings will share the underlying C++ native protobuf object with C++ when it is passed by const & or const *.
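As a minimal sketch of the const-reference case (assuming use_fast_cpp_protos and PYBIND11_PROTOBUF_ASSUME_FULL_ABI_COMPATIBILITY are in effect; MyMessage and its items field are hypothetical, following the basic example above):

```cpp
#include <pybind11/pybind11.h>

#include "path/to/my/my_message.proto.h"
#include "pybind11_protobuf/native_proto_caster.h"

// Taking the message by const reference makes this argument eligible for
// sharing the underlying C++ object instead of a serialize/parse round trip.
// A by-value or mutable-pointer signature would not qualify.
int CountItems(const MyMessage& msg) { return msg.items_size(); }

PYBIND11_MODULE(my_module, m) {
  pybind11_protobuf::ImportNativeProtoCasters();
  m.def("count_items", &CountItems, pybind11::arg("msg"));
}
```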

Protobuf Extensions

When use_fast_cpp_protos is in use and protobuf extensions are involved, a well-known pitfall is that extensions are silently moved to the proto2::UnknownFieldSet when a message is deserialized in C++ but the cc_proto_library for the extensions is not linked in. The root cause is an asymmetry in the handling of Python protos vs C++ protos: when a Python proto is deserialized, both the Python descriptor pool and the C++ descriptor pool are inspected, but when a C++ proto is deserialized, only the C++ descriptor pool is inspected. Until this asymmetry is resolved, the cc_proto_library for all extensions involved must be added to the deps of the relevant pybind_library or pybind_extension. If this is impractical, pybind11_protobuf::check_unknown_fields::ExtensionsWithUnknownFieldsPolicy::WeakEnableFallbackToSerializeParse or pybind11_protobuf::AllowUnknownFieldsFor can be used.

The pitfall is sufficiently unobvious to be a setup for regular accidents, potentially with critical consequences.

To guard against the most common type of accident, native_proto_caster.h includes a safety mechanism that raises "Proto Message has an Unknown Field" in certain situations:

  • When use_fast_cpp_protos is in use,
  • a protobuf message is returned from C++ to Python,
  • the message involves protobuf extensions (recursively),
  • and the proto2::UnknownFieldSet for the message or any of its submessages is not empty.

pybind11_protobuf::check_unknown_fields::ExtensionsWithUnknownFieldsPolicy::WeakEnableFallbackToSerializeParse is a global escape hatch that trades convenience against runtime overhead: the convenience is that it is not necessary to determine which cc_proto_library dependencies need to be added; the overhead is that SerializePartialToString/ParseFromString is used for messages with unknown fields, instead of the much faster CopyFrom.

Another escape hatch is pybind11_protobuf::AllowUnknownFieldsFor, which simply disables the safety mechanism for specific message types, without a runtime overhead. This is useful for situations in which unknown fields are acceptable.
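As a sketch, either escape hatch is typically invoked once during module initialization; normally only one of the two would be used. The message and field names below are placeholders taken from the test suite:

```cpp
#include <pybind11/pybind11.h>

#include "pybind11_protobuf/check_unknown_fields.h"
#include "pybind11_protobuf/native_proto_caster.h"

PYBIND11_MODULE(my_module, m) {
  pybind11_protobuf::ImportNativeProtoCasters();

  // Global escape hatch: fall back to serialize/parse for messages with
  // unknown fields (slower, but no cc_proto_library deps hunting needed).
  pybind11_protobuf::check_unknown_fields::ExtensionsWithUnknownFieldsPolicy::
      WeakEnableFallbackToSerializeParse();

  // Targeted escape hatch: disable the check for one message type, with no
  // runtime overhead (message/field names here are placeholders).
  pybind11_protobuf::AllowUnknownFieldsFor("pybind11.test.NestRepeated",
                                           "base_msgs");
}
```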

An example of a full error message generated by the safety mechanism (with line breaks added here for readability):

Proto Message of type pybind11.test.NestRepeated has an Unknown Field with
parent of type pybind11.test.BaseMessage: base_msgs.1003
(pybind11_protobuf/tests/extension_nest_repeated.proto,
pybind11_protobuf/tests/extension.proto).
Please add the required `cc_proto_library` `deps`.
Only if there is no alternative to suppressing this error, use
`pybind11_protobuf::AllowUnknownFieldsFor("pybind11.test.NestRepeated", "base_msgs");`
(Warning: suppressions may mask critical bugs.)

Note that the current implementation of the safety mechanism is a compromise solution, trading off simplicity of implementation, runtime performance, and precision. Alerting developers of new code to unknown fields is assumed to be generally helpful, but the unknown fields detection is limited to messages with extensions, to avoid the runtime overhead for the presumably much more common case that no extensions are involved. Because of this, the runtime overhead for the safety mechanism is expected to be very small.

Enumerations

Enumerations are passed and returned as integers. You may use the enum values from the native python proto module to set and check the enum values used by a bound proto enum (see tests/proto_enum_test.py for an example).
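Because bound proto enums surface as plain integers, Python-side code can compare them directly against ints. The behaviour is analogous to the standard library's enum.IntEnum (a stdlib analogy for illustration, not the protobuf API itself):

```python
import enum

# Stdlib analogy: like a generated proto enum, members compare equal to ints.
class Color(enum.IntEnum):
    COLOR_UNSPECIFIED = 0
    COLOR_RED = 1
    COLOR_BLUE = 2

# A function bound with native_proto_caster would receive/return plain ints
# here; the named constants can still be used for readable comparisons.
def describe(value: int) -> str:
    return Color(value).name

print(describe(1))            # COLOR_RED
print(Color.COLOR_BLUE == 2)  # True
```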

In / Out Parameters

In cases where a protocol buffer is used as an in/out parameter in C++, additional logic will be required in the wrapper. For example:

#include <pybind11/pybind11.h>

#include "path/to/my/my_message.proto.h"
#include "pybind11_protobuf/native_proto_caster.h"

void MutateMessage(MyMessage* in_out) { ... }

PYBIND11_MODULE(my_module, m) {
  pybind11_protobuf::ImportNativeProtoCasters();

  m.def("mutate_message", [](MyMessage in) {
    MutateMessage(&in);
    return in;
  },
  pybind11::arg("in"));
}

pybind11_protobuf/wrapped_proto_caster.h

TL;DR: Ignore wrapped_proto_caster.h if you can; this header was added only as a migration aid before the removal of proto_casters.h.

Historic background: Due to the nature of pybind11, extension modules built using native_proto_caster.h could not interoperate with the older proto_casters.h bindings, as that would have led to C++ ODR violations. wrapped_proto_caster.h is a nearly-transparent wrapper for native_proto_caster.h to work around the ODR issue. With the migration to native_proto_caster.h now completed, wrapped_proto_caster.h is obsolete and will be removed in the future.

pybind11_protobuf's People

Contributors

acozzette, anandolee, davidtwomey, haoyuz, jerub, kenoslund-google, laramiel, lopsided98, mchinen, michaelreneer, mizux, nimrod-gileadi, ondrasej, petebu, rickeylev, rwgk, srmainwaring, zacharygarrett


pybind11_protobuf's Issues

Conversion not working for C++ api_implementation

I have a simple example that works perfectly when protoc generates native Python for protobufs, but it does not work correctly when using the C++ implementation (i.e. cleaning, setting export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=cpp, then rebuilding).

The full example is here https://github.com/srmainwaring/pybind11_protobuf/tree/feature/extras.

System: macOS Big Sur 11.6.1
Python: 3.9.9 installed with brew
protoc: 3.19.1 installed with brew

The relevant portion of the stack trace is:

    msg = m.set_vector3d(msg)
TypeError: set_vector3d(): incompatible function arguments. The following argument types are supported:
    1. (msg: extras::msgs::Vector3d) -> None

Invoked with: x: 1.0
y: 2.0
z: 3.0

where the example is a variation of the basic example from the README comprising a pair of set and get functions returning and accepting a simple message.

The isinstance behaviour of returned types is also not working as expected.

Uninstalling the brew version of protobuf does not change the outcome. I expect this may be a user / configuration issue but can't spot where I'm going wrong. Any help appreciated.

proto package is not preserved

For this test I pulled the example code from https://github.com/davidtwomey/pybind11_protobuf_example.git and updated it to use native_proto_caster and the latest versions of the libraries: pybind_proto_example.zip

The issue is that the proto created in python has the type <class 'example_pb2.TestMessage'>
but the proto created in C++ has the type <class 'TestMessage'>

The module name is not preserved despite package being specified in example.proto. This breaks isinstance().

We see even more weirdness in our code with nested protos. Sometimes the nested proto includes the full module name (python-style) and sometimes it doesn't (C++ style).

Questions:

  1. Why is it happening?
  2. Is it safe?

To test:
unzip the attached code and run bazel run example

Compiler error when integrating with `pybind11 v2.4.3` project

Hello, I am trying to integrate the latest pybind11_protobuf (commit a3d93a93387af7fa57d72d56cfc0a4ba7f4a60e4) into my project which uses pybind 2.4.3 and bazel.

I am running into a compiler error on what seems to be an incompatibility between pybind11 and pybind11_protobuf. Do you have any advice for how to resolve this?

In file included from external/com_github_pybind_pybind11_protobuf/pybind11_protobuf/native_proto_caster.cc:1:
external/com_github_pybind_pybind11_protobuf/pybind11_protobuf/native_proto_caster.h:102:8: error: too many template arguments for class template 'move_only_holder_caster'
struct move_only_holder_caster<
       ^
external/pybind11/include/pybind11/cast.h:1518:8: note: template is declared here
struct move_only_holder_caster {
       ^
In file included from external/com_github_pybind_pybind11_protobuf/pybind11_protobuf/native_proto_caster.cc:1:
external/com_github_pybind_pybind11_protobuf/pybind11_protobuf/native_proto_caster.h:154:8: error: too many template arguments for class template 'copyable_holder_caster'
struct copyable_holder_caster<
       ^
external/pybind11/include/pybind11/cast.h:1438:8: note: template is declared here
struct copyable_holder_caster : public type_caster_base<type> {
       ^
2 errors generated.

Here is my bazel rule for pybind11_protobuf:

cc_library(
    name = "pybind11_protobuf",
    hdrs = glob([
        "pybind11_protobuf/*.h",
    ]),
    deps = [
        "@pybind11//:pybind11",
        "@com_google_protobuf//:protobuf",
        "@com_google_protobuf//:proto_api",
        "@com_google_absl//absl/container:flat_hash_map",
        "@com_google_absl//absl/strings",
        "@com_google_absl//absl/types:optional"
    ],
    visibility = ["//visibility:public"],
    copts=[
        "-Iexternal/pybind11/include/include",
        "-Iexternal/pybind11/include",
        "-Iexternal/com_google_protobuf/include",
        "-Iexternal/com_google_absl/include"
    ]
)

How to integrate this in a CMake project?

I am very interested in getting this to run in a CMake project and possibly contributing to make that happen.

I have this repo cloned as a submodule inside a small test project, but cannot get this to compile / run without linker errors.

This is my current setup:

Files

extern/pybind11_protobuf
CMakeLists.txt
fastproto.cpp
Person.proto

Person.proto:

syntax = "proto2";

message Person {
  required int32 id = 2;
  required string name = 1;
  optional string email = 3;
}

fastproto.cpp

#include <pybind11/pybind11.h>
#include "Person.pb.h"
#include <string>
#include "pybind11_protobuf/native_proto_caster.h"

Person get_person() {
    Person person;
    person.set_id(1);
    person.set_name(std::string{"Maximilian Nöthe"});
    person.set_email(std::string{"[email protected]"});
    return person;
}

PYBIND11_MODULE(fastproto, m) {
    pybind11_protobuf::ImportNativeProtoCasters();
    m.def("get_person", &get_person);
}

CMakeLists.txt

cmake_minimum_required(VERSION 3.17..3.22)

project(Quirc++ VERSION 0.1.0 LANGUAGES C CXX)

include(GNUInstallDirs)

find_package(Python3 REQUIRED COMPONENTS Development Interpreter)
find_package(pybind11 REQUIRED)
find_package(Protobuf REQUIRED)

protobuf_generate_cpp(PROTO_SRCS PROTO_HDRS Person.proto)

pybind11_add_module(
    fastproto
    fastproto.cpp
    ${PROTO_SRCS}
    extern/pybind11_protobuf/pybind11_protobuf/native_proto_caster.cc
    extern/pybind11_protobuf/pybind11_protobuf/proto_cast_util.cc
    extern/pybind11_protobuf/pybind11_protobuf/proto_utils.cc
)
target_include_directories(fastproto PRIVATE ${CMAKE_CURRENT_BINARY_DIR} extern/pybind11_protobuf)
target_link_libraries(fastproto PRIVATE protobuf::libprotobuf)
target_compile_features(fastproto PRIVATE cxx_std_14)
set_target_properties(fastproto PROPERTIES
    CXX_EXTENSIONS OFF
    CXX_STANDARD_REQUIRED ON
)

Or use the repo here: https://github.com/maxnoe/pybind_protobuf_test

Experimenting with this, I had several issues. However, in this version, which compiles the pybind11_protobuf sources directly, my main problem is that a protobuf header is not found:

❯ cmake --build build
-- Found pybind11: /usr/include (found version "2.9.0")
-- Configuring done
-- Generating done
-- Build files have been written to: /home/maxnoe/Projects/protobuf_pybind/build
Consolidate compiler generated dependencies of target fastproto
[ 14%] Building CXX object CMakeFiles/fastproto.dir/extern/pybind11_protobuf/pybind11_protobuf/proto_cast_util.cc.o
/home/maxnoe/Projects/protobuf_pybind/extern/pybind11_protobuf/pybind11_protobuf/proto_cast_util.cc:19:10: fatal error: python/google/protobuf/proto_api.h: No such file or directory
   19 | #include "python/google/protobuf/proto_api.h"
      |          ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
compilation terminated.
make[2]: *** [CMakeFiles/fastproto.dir/build.make:126: CMakeFiles/fastproto.dir/extern/pybind11_protobuf/pybind11_protobuf/proto_cast_util.cc.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:83: CMakeFiles/fastproto.dir/all] Error 2
make: *** [Makefile:91: all] Error 2

It seems to be this one:
https://github.com/protocolbuffers/protobuf/blob/master/python/google/protobuf/proto_api.h

But that header does not seem to be part of an installed protobuf distribution.

Any help on getting this to run would be appreciated.

Python 3.11 and PyFrameObject

Has pybind11_protobuf been tested with Python 3.11 and forward?
I believe PyFrameObject has been removed from the public API, and I am curious to know whether someone has verified that this works with that version or later.

Tests fail if protobuf is already installed in default bazel python executable

I am encountering different behaviour when trying to run the tests, depending on whether protobuf is already installed in the default Python executable (in my case /usr/bin/python3.8). OS: Ubuntu 20.

Any help would be much appreciated! Perhaps I am missing something obvious here.

Without pre-existing protobuf py installation 🆗

When running tests without any python protobuf library the bazel run command works fine.
(Although I cannot import the extension outside of bazel run.)

# Check no protobuf py installation
pip show protobuf
$-> WARNING: Package(s) not found: protobuf
git clone https://github.com/pybind/pybind11_protobuf
cd pybind11_protobuf
bazel run pybind11_protobuf/tests:wrapped_proto_module_test

Tests run and pass ok

Running tests under Python 3.8.10: /usr/bin/python3...
Ran 20 tests in 0.011s
... 
OK

Note, however, that you cannot run the generated protobuf test_pb2.py file directly:
python bazel-bin/pybind11_protobuf/tests/test_pb2.py

Traceback (most recent call last):
 File "bazel-bin/pybind11_protobuf/tests/test_pb2.py", line 21, in <module>
   create_key=_descriptor._internal_create_key,
AttributeError: module 'google.protobuf.descriptor' has no attribute '_internal_create_key'

With protobuf pre-installed (default) ❌

When I install protobuf via default pip installation before running the tests, I encounter a segmentation fault.

# Install protobuf
pip install protobuf
bazel clean
bazel run pybind11_protobuf/tests:wrapped_proto_module_test

Tests now fail with a Segmentation fault

Running tests under Python 3.8.10: /usr/bin/python3
[ RUN      ] FastProtoTest.test_call_with_none
[       OK ] FastProtoTest.test_call_with_none
[ RUN      ] FastProtoTest.test_call_with_str
[       OK ] FastProtoTest.test_call_with_str
[ RUN      ] FastProtoTest.test_equality
Fatal Python error: Segmentation fault

Current thread 0x00007fb44f774740 (most recent call first):
  File "/home/dtwomey/.cache/bazel/_bazel_dtwomey/866346c0fc431ca03a19bd6a2c5a1c06/execroot/com_google_pybind11_protobuf/bazel-out/k8-fastbuild/bin/pybind11_protobuf/tests/wrapped_proto_module_test.runfiles/com_google_pybind11_protobuf/pybind11_protobuf/tests/wrapped_proto_module_test.py", line 76 in test_equality
  File "/usr/lib/python3.8/unittest/case.py", line 633 in _callTestMethod
  File "/usr/lib/python3.8/unittest/case.py", line 676 in run
  File "/usr/lib/python3.8/unittest/case.py", line 736 in __call__
  File "/usr/lib/python3.8/unittest/suite.py", line 122 in run
  File "/usr/lib/python3.8/unittest/suite.py", line 84 in __call__
  File "/usr/lib/python3.8/unittest/suite.py", line 122 in run
  File "/usr/lib/python3.8/unittest/suite.py", line 84 in __call__
  File "/usr/lib/python3.8/unittest/runner.py", line 176 in run
  File "/usr/lib/python3.8/unittest/main.py", line 271 in runTests
  File "/usr/lib/python3.8/unittest/main.py", line 101 in __init__
  File "/home/dtwomey/.cache/bazel/_bazel_dtwomey/866346c0fc431ca03a19bd6a2c5a1c06/execroot/com_google_pybind11_protobuf/bazel-out/k8-fastbuild/bin/pybind11_protobuf/tests/wrapped_proto_module_test.runfiles/com_google_absl_py/absl/testing/absltest.py", line 2409 in _run_and_get_tests_result
  File "/home/dtwomey/.cache/bazel/_bazel_dtwomey/866346c0fc431ca03a19bd6a2c5a1c06/execroot/com_google_pybind11_protobuf/bazel-out/k8-fastbuild/bin/pybind11_protobuf/tests/wrapped_proto_module_test.runfiles/com_google_absl_py/absl/testing/absltest.py", line 2438 in run_tests
  File "/home/dtwomey/.cache/bazel/_bazel_dtwomey/866346c0fc431ca03a19bd6a2c5a1c06/execroot/com_google_pybind11_protobuf/bazel-out/k8-fastbuild/bin/pybind11_protobuf/tests/wrapped_proto_module_test.runfiles/com_google_absl_py/absl/testing/absltest.py", line 2122 in main_function
  File "/home/dtwomey/.cache/bazel/_bazel_dtwomey/866346c0fc431ca03a19bd6a2c5a1c06/execroot/com_google_pybind11_protobuf/bazel-out/k8-fastbuild/bin/pybind11_protobuf/tests/wrapped_proto_module_test.runfiles/com_google_absl_py/absl/app.py", line 251 in _run_main
  File "/home/dtwomey/.cache/bazel/_bazel_dtwomey/866346c0fc431ca03a19bd6a2c5a1c06/execroot/com_google_pybind11_protobuf/bazel-out/k8-fastbuild/bin/pybind11_protobuf/tests/wrapped_proto_module_test.runfiles/com_google_absl_py/absl/app.py", line 303 in run
  File "/home/dtwomey/.cache/bazel/_bazel_dtwomey/866346c0fc431ca03a19bd6a2c5a1c06/execroot/com_google_pybind11_protobuf/bazel-out/k8-fastbuild/bin/pybind11_protobuf/tests/wrapped_proto_module_test.runfiles/com_google_absl_py/absl/testing/absltest.py", line 2124 in _run_in_app
  File "/home/dtwomey/.cache/bazel/_bazel_dtwomey/866346c0fc431ca03a19bd6a2c5a1c06/execroot/com_google_pybind11_protobuf/bazel-out/k8-fastbuild/bin/pybind11_protobuf/tests/wrapped_proto_module_test.runfiles/com_google_absl_py/absl/testing/absltest.py", line 2007 in main
  File "/home/dtwomey/.cache/bazel/_bazel_dtwomey/866346c0fc431ca03a19bd6a2c5a1c06/execroot/com_google_pybind11_protobuf/bazel-out/k8-fastbuild/bin/pybind11_protobuf/tests/wrapped_proto_module_test.runfiles/com_google_pybind11_protobuf/pybind11_protobuf/tests/wrapped_proto_module_test.py", line 88 in <module>

With protobuf pre-installed (cpp_implementation build flag) ❌

Another variant of protobuf installation where I specify a pip install option which forces it to fall back on a setup.py build.

# Remove previous version
pip uninstall protobuf
# Install protobuf
pip install protobuf==3.18 --install-option="--cpp_implementation"
# Check protobuf implementation
python -c "from google.protobuf.internal import api_implementation; print(api_implementation._default_implementation_type)"
$ -> cpp
bazel clean
bazel run pybind11_protobuf/tests:wrapped_proto_module_test

Tests fail

exec ${PAGER:-/usr/bin/less} "$0" || exit 1
Executing tests from //pybind11_protobuf/tests:wrapped_proto_module_test
-----------------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/dtwomey/.cache/bazel/_bazel_dtwomey/866346c0fc431ca03a19bd6a2c5a1c06/execroot/com_google_pybind11_protobuf/bazel-out/k8-fastbuild/bin/pybind11_protobuf/tests/wrapped_proto_module_test.runfiles/com_google_pybind11_protobuf/pybind11_protobuf/tests/wrapped_proto_module_test.py", line 14, in <module>
    from pybind11_protobuf.tests import compare
  File "/home/dtwomey/.cache/bazel/_bazel_dtwomey/866346c0fc431ca03a19bd6a2c5a1c06/execroot/com_google_pybind11_protobuf/bazel-out/k8-fastbuild/bin/pybind11_protobuf/tests/wrapped_proto_module_test.runfiles/com_google_pybind11_protobuf/pybind11_protobuf/tests/compare.py", line 50, in <module>
    from google.protobuf import descriptor
  File "/home/dtwomey/.local/lib/python3.8/site-packages/google/protobuf/descriptor.py", line 47, in <module>
    from google.protobuf.pyext import _message
ImportError: /home/dtwomey/.local/lib/python3.8/site-packages/google/protobuf/pyext/_message.cpython-38-x86_64-linux-gnu.so: undefined symbol: _ZNK6google8protobuf10TextFormat21FastFieldValuePrinter19PrintMessageContentERKNS0_7MessageEiibPNS1_17BaseTextGeneratorE

Unable to use without proto_api despite `enable_pyproto_api_setting` unset

I wanted to use this in an environment where the pyext for proto_api wasn't built/installed, and since enable_pyproto_api_setting is disabled by default, I assumed this would work.

However, the build already fails due to an unconditional dependency on @com_google_protobuf//:proto_api and an include of python/google/protobuf/proto_api.h.

I made a couple changes such that it builds with the version TF 2.13.0 uses: Flamefire@f49bc41

However, on current main it seems to be much harder, as check_unknown_fields now depends on proto_api too, which makes it look like it may not work that easily anymore.

Is there interest in getting this fixed/done? Any feedback on the feasibility of the above change/commit?

The usecase was to compile TensorFlow with a pre-installed protobuf to avoid conflicts when using potentially different versions in one environment.

Create a release package to enable adding pybind11_protobuf to Bazel Central Registry

Hi,
Could you please add a complete set of instructions on how to include and use with Bazel?

I added the following stanza to my WORKSPACE file:

git_repository(
    name = "pybind11_protobuf",
    branch = "main",
    remote = "https://github.com/pybind/pybind11_protobuf.git",
    strip_prefix = "pybind11_protobuf",
)

However, I cannot get the requirement for a com_google_protobuf dependency to work.
I believe the current practice would be to add this dep to MODULE.bazel:
https://registry.bazel.build/modules/protobuf/21.7

and then add this to the BUILD rule:
"@protobuf//:protos_python",

This however gives me this error:
no such package '@com_google_protobuf//': The repository '@com_google_protobuf' could not be resolved: Repository '@com_google_protobuf' is not defined and referenced by '@pybind11_protobuf//:native_proto_caster'

What's the correct way to get this to work?
(for future compatibility, I have filed a BCR request here bazelbuild/bazel-central-registry#1121 )

What's the advantage of using pybind11_protobuf over just passing a serialised message string instead?

Hi!

Thanks for the great work! In my project I need to pass a protobuf message between Python and a C++ function installed as a module using pybind11. I am just curious: for such a scenario, why would someone use this instead of passing serialized strings as arguments?

I am guessing speed could be one factor, but how big do the protobuf messages need to be to really make a difference in speed?

Thanks,
Indraneel
