namely / docker-protoc
Docker images for generating protocol buffer definitions
License: BSD 3-Clause "New" or "Revised" License
Issue #61 has come back again :(
Can I change the source and create a new PR?
Hi.
I created the API as follows:
service ServiceA {
  rpc Ping (service.common.proto.MessagePing) returns (service.common.proto.MessagePong) {
    option (google.api.http) = {
      get: "/core/serviceA/ping/{timestamp}"
    };
  }
}
Then I configured config.yaml:
backend: "localhost:8000"
cors:
  allow-origin: "*"
  allow-credentials:
  allow-methods: "POST, GET, OPTIONS, PUT, DELETE"
  allow-headers: "Accept, Content-Type, Content-Length, Accept-Encoding, X-CSRF-Token, Authorization"
proxy:
  port: 9000
  api-prefix: "/core"
swagger:
  file: "..swagger.json"
When I test the API http://localhost:9000/core/serviceA/ping/900000 with Postman, the result is Not Found.
I tried modifying the code in one place:
//mux.Handle(prefix, handlers.CustomLoggingHandler(os.Stdout, http.StripPrefix(prefix[:len(prefix)-1], addRespHeaders(cfg, gwmux)), formatter))
mux.Handle(prefix, addRespHeaders(cfg, gwmux))
And the API call then succeeded.
I don't understand why the original line causes the error. Can you help explain?
https://github.com/chrusty/protoc-gen-jsonschema
I am happy to put together a PR for this as well. It should be able to follow the same pattern as the other plugins, though its way of configuring options is a little unusual, and I'll have to think about whether that explodes the list of available options on the command.
I generated Python files into a folder with this command:
docker run --rm -v `pwd`:/defs namely/protoc-all:1.11 -d ../proto/ -o my_project/my_project -l python
and then I got the following file structure:
my_project
├── my_project
│   ├── __init__.py
│   ├── a_pb2.py
│   └── a_pb2_grpc.py
├── __init__.py
└── setup.py
In the a_pb2_grpc.py file, it imports a_pb2 as a__pb2, which doesn't work, because in Python, to import a module from the same package you need from . import a_pb2 as a__pb2. Not sure if this is a bug. By the way, I'm using Python 3.6.1.
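A possible workaround (a sketch only; the paths and the sed pattern are illustrative, not part of this repo) is to post-process the generated *_pb2_grpc.py files so the import becomes package-relative:

```shell
# Demonstrate the rewrite on a sample generated file; in practice you would
# run the sed command over my_project/my_project/*_pb2_grpc.py instead.
mkdir -p gen_demo
printf 'import a_pb2 as a__pb2\n' > gen_demo/a_pb2_grpc.py

# Turn "import X_pb2 as Y" into "from . import X_pb2 as Y" (Python 3
# package-relative import). -i.bak keeps a backup of the original file.
sed -i.bak -E 's/^import ([A-Za-z0-9_]+_pb2) as/from . import \1 as/' \
  gen_demo/*_pb2_grpc.py

cat gen_demo/a_pb2_grpc.py   # → from . import a_pb2 as a__pb2
```

The `.bak` backups can be deleted afterwards with `rm gen_demo/*.bak`.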
grpc 1.19.0 https://github.com/grpc/grpc/releases/tag/v1.19.0 was released on 2019-02-26, and the latest version of grpc is 1.19.1 https://github.com/grpc/grpc/releases/tag/v1.19.1, which was released on 2019-03-08.
Previous bump: #105
Hi
First of all, thank you: your project helped me a lot.
When I use the grpc-gateway feature with multiple services spread across different proto files, the grpc-gateway Docker image will only connect to one of those services.
Example:
I have file serviceA.proto
service ServiceA {
  rpc Ping (service.common.proto.MessagePing) returns (service.common.proto.MessagePong) {
    option (google.api.http) = {
      get: "/core/serviceA/ping/{timestamp}"
    };
  }
}
And I have file serviceB.proto
service ServiceB {
  rpc Ping (service.common.proto.MessagePing) returns (service.common.proto.MessagePong) {
    option (google.api.http) = {
      get: "/core/serviceA/ping/{timestamp}"
    };
  }
}
Then I run the command:
$ docker run -v `pwd`:/defs namely/gen-grpc-gateway -f . -s ServiceA
I checked in the func SetupMux and found that only ServiceA is registered, because the command only lets you configure a single service via -s ServiceA.
I found a way to fix this by changing SetupMux(...) as follows:
func SetupMux(ctx context.Context, cfg proxyConfig) *http.ServeMux {
	...
	err := gw.RegisterServiceAHandlerFromEndpoint(ctx, gwmux, cfg.backend, opts)
	if err != nil {
		logrus.Fatalf("Could not register gateway: %v", err)
	}
	err = gw.RegisterServiceBHandlerFromEndpoint(ctx, gwmux, cfg.backend, opts)
	if err != nil {
		logrus.Fatalf("Could not register gateway: %v", err)
	}
}
I have a solution for this problem. Can I create a pull request for this project?
@gxb5443 @mdkess
The Docker image namely/gen-grpc-gateway:1.30_0 emits a deprecation warning.
Could you please upgrade protoc-gen-grpc-gateway?
In namely/gen-grpc-gateway:1.30_0, when you run:
# protoc-gen-grpc-gateway -version
Version dev, commit unknown, built at unknown
So using go get in the Dockerfile may not be the best practice?
Imagine we wanted to host all of our APIs under the /api endpoint.
A service's gRPC gateway may expose an endpoint such as /my-service/foo. Clients would access it by going to /api/my-service/foo. A rewrite rule at the edge proxy (e.g. NGINX) would redirect requests for /api/my-service to the grpc-gateway.
However, with the current implementation, the grpc-gateway wouldn't understand the rewritten path.
This task is to make the grpc-gateway aware, via a configuration value, of the prefix under which it's running. That is, /my-service/foo defined in the proto would be accessible via /api/my-service/foo.
My code
PROTOC_MAP ?= Mmy/shared/proto/files.proto=github.com/my/shared/proto/files,Mmy/shared/proto/files.proto=github.com/my/shared/proto/files
PROTOC ?= docker run --rm -u ${shell id -u} \
	-v ${PWD}:/defs namely/protoc-all:1.29_1 \
	-l gogo \
	--go-package-map ${PROTOC_MAP} \
	--with-validator \
	-o ./ \
	-i ./proto

$(PROTOC) \
	-d ./proto/path/to/proto/file
I think this is the problem: it's not possible to set the package mapping for the validator.
docker-protoc/all/entrypoint.sh
Line 313 in 657a4d7
My quick workaround is to replace the generated local import paths with global paths, using code similar to:
# rewrite local import paths to global paths in the generated validator files
find ${PWD}/path/to/gen/proto/files \
  -name '*.pb.validate.go' \
  -exec sed -i.bak 's/generatedlocalpath/globalpath/g' {} +

# clean up the sed backup files
find ${PWD}/path/to/gen/proto/files \
  -name '*.bak' \
  -exec rm {} +
I am having issues when trying to use Google Protobufs.
I have a proto file that looks like this:
syntax = "proto3";

import "google/protobuf/timestamp.proto";

package test.protobuf;

message Auditing {
  int32 createdByUserId = 1;
  google.protobuf.Timestamp createdDate = 2;
  int32 lastModifiedByUserId = 3;
  google.protobuf.Timestamp lastModifiedDate = 4;
}
But when I run the command
docker run -v `pwd`:/defs namely/protoc-all \
-d proto/dto -o ./src/main/java \
-l java
I receive the response
google/protobuf/timestamp.proto: File not found.
auditing.proto:2:1: Import "google/protobuf/timestamp.proto" was not found or had errors.
The README.md of this repository says the protobuf library artifact is included in the Docker image. Do I need to alter my command to include the Google protobufs?
Is there any plan to upgrade go-micro plugin to v2?
https://github.com/micro/micro/tree/master/cmd/protoc-gen-micro
When running make build locally, the build fails about 15 minutes in, for the same reason the build is failing in the Azure pipeline. Any idea what's going on here?
...
[CXX] Compiling third_party/abseil-cpp/absl/base/internal/spinlock_wait.cc
In file included from third_party/abseil-cpp/absl/base/internal/spinlock_wait.cc:27:
third_party/abseil-cpp/absl/base/internal/spinlock_linux.inc:17:10: fatal error: linux/futex.h: No such file or directory
#include <linux/futex.h>
^~~~~~~~~~~~~~~
compilation terminated.
make: *** [Makefile:3169: /tmp/grpc/objs/opt/third_party/abseil-cpp/absl/base/internal/spinlock_wait.o] Error 1
The command '/bin/sh -c /tmp/install-protobuf.sh ${grpc} ${grpc_java}' returned a non-zero code: 2
Passing proto files with multiple different packages currently fails when using protoc-gen-go.
Consequently, the -d option in gen-proto does not work with multiple packages.
/defs # entrypoint.sh -d . -l go
Generating go files for . in gen/pb-go
2018/09/06 15:15:28 protoc-gen-go: error:inconsistent package names: package_one, package_two
--go_out: protoc-gen-go: Plugin failed with status code 1.
As a workaround, we can iterate over each proto file and call the protoc compiler on it in the entrypoint.sh script:
for filename in ${PROTO_DIR}/*.proto; do
  filename=${filename##*/}  # remove path
  filename=${filename%.*}   # remove extension
  rm -rf ${GO_OUTDIR}/${filename}
  mkdir ${GO_OUTDIR}/${filename}
  ${PROTOC} \
    --go_out=plugins=grpc:${GO_OUTDIR}/${filename} \
    -I ${PROTO_DIR} ${filename}.proto
done
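The two parameter expansions in the loop above can be sanity-checked in isolation:

```shell
# How the loop derives a bare proto name from a full path:
filename="proto/sub/foo.proto"

filename=${filename##*/}   # strip everything up to the last "/"
echo "$filename"           # → foo.proto

filename=${filename%.*}    # strip the shortest trailing ".*" (the extension)
echo "$filename"           # → foo
```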
Not sure if this should be a pull request or an issue, but I was wondering how you manage this kind of thing at Namely?
After the 1.18 release took place yesterday: https://github.com/grpc/grpc/releases
I've just been setting this up for my team's monorepo and have run into a slight issue when generating protos for Go.
We currently have a namespaced directory structure, something along the lines of /<namespace>/<service>/*.proto; this works great for Node and Ruby.
However, Go seems to have a maxdepth of 1 for its find, set at https://github.com/namely/docker-protoc/blob/master/all/entrypoint.sh#L338, which is causing "Missing Input file" errors.
Hi
Is there a plan to support C++ as a target language?
By default, "HTTP headers that start with 'Grpc-Metadata-' are mapped to gRPC metadata (prefixed with grpcgateway-)".
As someone building a gRPC gateway, I would like to be able to specify in the config which headers are forwarded, and to change the prefixing rules.
The matching occurs in the DefaultHeaderMatcher:
https://github.com/grpc-ecosystem/grpc-gateway/blob/b2423da79386df5bb9d02f5d3d5793042050a0d9/runtime/mux.go#L53
which is set here (and can be overridden): https://github.com/grpc-ecosystem/grpc-gateway/blob/b2423da79386df5bb9d02f5d3d5793042050a0d9/runtime/mux.go#L130
This task is to make that header matcher configurable.
With csharp for example, you end up with
gen/pb-csharp
gen/pb-csharp/FunTimes.cs
gen/pb-csharp/FunTimesGrpc.cs
gen/pb-csharp/gateway
gen/pb-csharp/gateway/FunTimes.pb.gw.go
gen/pb-csharp/gateway/FunTimes.swagger.json
But the gateway needs the Go gRPC client to work.
v1.22.0 was released a month ago
The descriptor set output is a good option, but to take full advantage of the functionality it would be most helpful to be able to set the --include_imports and --include_source_info flags.
Currently the scripts default to outputting generated files to the current directory. Would it be possible to have the scripts output the generated files to a different directory in the container's path, and then map that to a different folder at run time? Something like:
docker run -v `pwd`:/defs -v /some/other/path:/gen namely/protoc-ruby
Conveniently, java_out can handle a .jar extension and will automatically jar up the generated .java files for you.
https://developers.google.com/protocol-buffers/docs/reference/java-generated#invocation
When outputting Java code, the protocol buffer compiler's ability to output directly to JAR archives is particularly convenient, as many Java tools are able to read source code directly from JAR files. To output to a JAR file, simply provide an output location ending in .jar. Note that only the Java source code is placed in the archive; you must still compile it separately to produce Java class files.
However, the entrypoint always assumes that the -o parameter is a directory:
docker-protoc/all/entrypoint.sh
Lines 125 to 127 in e9304e8
So if you invoke it with a .jar output parameter:
-o build/java/my.jar
It will fail with this error message:
Generating java files for proto in build/java/foo.jar
build/java/my.jar: Is a directory%
I think the solution is to check for this case and only mkdir -p through to the nearest directory here:
docker-protoc/all/entrypoint.sh
Lines 125 to 127 in e9304e8
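A minimal sketch of that check (the function name is hypothetical; this is not the actual entrypoint code):

```shell
# Create only the parent directory when the output target ends in .jar
# (protoc will write the jar file itself); otherwise treat -o as a directory.
prepare_out() {
  out="$1"
  case "$out" in
    *.jar) mkdir -p "$(dirname "$out")" ;;
    *)     mkdir -p "$out" ;;
  esac
}

prepare_out build/java/my.jar   # creates build/java/ only
prepare_out gen/pb-java         # creates the directory itself
```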
Hi,
I just tried generating a grpc-gateway with gen-grpc-gateway:1.20_2.
Our protobufs use a go_package option with a URL (e.g. our.domain.com/path/to/go/code), so the Go code gets generated into the folder gen/grpc-gateway/src/gen/pb-go/our.domain.com/path/to/go/code.
The swagger files get generated into gen/grpc-gateway/src/gen/pb-go/ directly.
So the ADD instruction in https://github.com/namely/docker-protoc/blob/master/gwy/templates/Dockerfile.tmpl#L25 cannot copy these.
Maybe copying directly from $OUT_PATH/ would work here?
https://github.com/namely/docker-protoc/blob/master/all/entrypoint.sh#L305-L308
If you provide an absolute path such as /defs/packages/build as the output, the loop attempts to recurse infinitely, because it never matches ".".
Documentation generation doesn't recognize markdown
--with-docs FORMAT Generate documentation (FORMAT is optional - see https://github.com/pseudomuto/protoc-gen-doc#invoking-the-plugin)
according to the link above:
The format may be one of the built-in ones ( docbook, html, markdown or json) or the name of a file containing a custom Go template.
but when I try to generate Java classes with documentation:
docker run --rm -v `pwd`:/defs namely/protoc-all -d . -o ./out/test --lint --with-docs markdown -l java
there is an error instead of documentation:
Generating java files for . in ./out/test
2018/10/25 07:59:37 Invalid parameter: markdown
--doc_out: protoc-gen-doc: Plugin failed with status code 1.
How could this be fixed?
An issue like #146 seems to have occurred with the latest release, 1.23_0.
When using the gogo language, the --with-gateway option doesn't work.
I get:
Generating grpc-gateway is Go specific.
Does anyone know how to easily add support for https://github.com/dropbox/mypy-protobuf? I found only golang-based Docker images in docker-protoc. Any ideas?
Are there any plans to support the paths=source_relative option for bindings generation?
As implemented today, you can't pass multiple includes via the -i parameter. Each invocation of -i overwrites the last.
Here's the code that does that:
docker-protoc/all/entrypoint.sh
Lines 69 to 72 in 3a87206
So invocations like this don't work:
docker run ... namely/protoc-all:1.11 -i protorepo -i something_else
It would be nice if each -i added onto EXTRA_INCLUDES.
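A sketch of what accumulating -i flags could look like (the function and argument handling here are illustrative, not the actual entrypoint parser):

```shell
# Append each -i value as an extra -I include instead of overwriting the last.
EXTRA_INCLUDES=""
parse_args() {
  while [ $# -gt 0 ]; do
    case "$1" in
      -i) EXTRA_INCLUDES="$EXTRA_INCLUDES -I$2"; shift 2 ;;
      *)  shift ;;
    esac
  done
}

parse_args -i protorepo -i something_else
# EXTRA_INCLUDES is now " -Iprotorepo -Isomething_else" and can be passed
# straight to the protoc invocation.
echo "$EXTRA_INCLUDES"
```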
I ran the docker run command on Ubuntu 16.04. However, it generates the files under the root user. Is there any workaround to not generate the files as root? I believe this happens because the Dockerfile doesn't specify a user and defaults to root.
Right now the gateway container logs some useful info when it boots, e.g.:
Proxying requests to gRPC service at 'companies:50051'
hit Ctrl-C to shutdown
launching http server on :80
2018/06/19 22:00:35 Creating grpc-gateway proxy with config: {companies:50051 service.swagger.json /api/companies/}
2018/06/19 22:00:35 API prefix is /api/companies/
However, it never logs anything after that. It would be great to see some info logged for each request and response.
Currently the docker image pulls the github.com/golang/protobuf/protoc-gen-go from master:
https://github.com/namely/docker-protoc/blob/master/Dockerfile#L42
Doing this with the latest released version of protoc, 3.6.1, causes:
vendor/github.com/faceit/protos-generated/push-service/v1.0.0/go/push-service.pb.go:23:11: undefined: proto.ProtoPackageIsVersion3
The issue was discussed here in quite some detail:
golang/protobuf#763
Is anyone getting around this right now? I'm about to open a PR to fix it otherwise, following the install instructions in the README of https://github.com/golang/protobuf#installation.
Thank you for adding the --with-typescript flag in PR #125.
When using --with-typescript, it seems no service type definitions are generated.
In the ts-protoc-gen docs it's mentioned to add service=true before the OUT_DIR, like so:
--ts_out="service=true:${OUT_DIR}"
See https://github.com/improbable-eng/ts-protoc-gen#generating-grpc-service-stubs-for-use-with-grpc-web for more info.
@esilkensen, as you worked on this recently, do you care to extend your current implementation with this service=true option?
Thanks in advance!
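A sketch of how the ts_out parameter could be assembled (the variable and flag names are hypothetical; the real wiring lives in entrypoint.sh):

```shell
# Prepend "service=true:" to the ts_out output directory when service stub
# generation is requested, matching the ts-protoc-gen invocation convention.
OUT_DIR="gen/pb-node"
TS_SERVICES=true

if [ "$TS_SERVICES" = true ]; then
  TS_OUT_PARAM="service=true:${OUT_DIR}"
else
  TS_OUT_PARAM="${OUT_DIR}"
fi

# The real invocation would then look something like:
#   protoc --plugin="protoc-gen-ts=${PROTOC_GEN_TS_PATH}" --ts_out="$TS_OUT_PARAM" ...
echo "$TS_OUT_PARAM"   # → service=true:gen/pb-node
```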
I just noticed that our CI jobs are failing with the latest build of the protoc-all docker container on Docker hub. We were using the latest tag (which is now 1.31_0) and it looks like the /opt/include directory in the container is missing files that were there in the previous container:
google/protobuf/timestamp.proto: File not found.
Import "google/protobuf/timestamp.proto" was not found or had errors.
"google.protobuf.Timestamp" is not defined.
As a short-term fix, I've tagged our container back to a previous release. Just thought you might want to know.
Thanks
I'm trying to use namely/protoc-all to generate pb classes for Python. It seems like a great tool, but I'm getting permission errors about the Python __init__.py module files every time I run it. Oddly, the generation seems to succeed, and only the touch fails.
My intention is to have the generated files and directories owned by the shell user as if they'd run everything locally, so I'm using roughly this make rule:
PROJECT=$(shell git rev-parse --show-toplevel 2> /dev/null)
DIR=$(shell pwd)
DOCKER_PROTOC=namely/protoc-all:1.23_0
UID=$(shell id -u)
GID=$(shell id -g)
.PHONY: types-python
types-python:
cd proto && \
docker run \
-u $(UID):$(GID) \
-v $(DIR)/proto:/defs \
-v $(DIR)/pose_detection/protobuf:/pose_detection/protobuf \
$(DOCKER_PROTOC) -l python -f poses.proto -o /pose_detection/protobuf
...when I run the above, it succeeds in generating the Python classes, but I get an error message too:
make types-python
cd proto && \
docker run \
-u 1001:1001 \
-v /home/larry/projects/pose-detection/protobuf/proto:/defs \
-v /home/larry/projects/pose-detection/protobuf/pose_detection/protobuf:/pose_detection/protobuf \
namely/protoc-all:1.23_0 -l python -f poses.proto -o /pose_detection/protobuf
touch: /pose_detection/__init__.py: Permission denied
Makefile:9: recipe for target 'types-python' failed
make: *** [types-python] Error 1
I'm assuming that generating compiled protobuf classes into a local directory is a common use of this tool, so maybe I'm just doing something wrong? If that's the case and there's a correction, I'd volunteer to update the README with the details.
Thanks,
Larry
It would be nice to add typescript support for node via the protoc-gen-ts plugin.
I'm happy to take this on as a PR.
Potential Changes to Dockerfile
RUN set -ex && apk --update --no-cache add \
...
nodejs \
nodejs-npm \
&& npm i -g ts-protoc-gen
ENV PROTOC_GEN_TS_PATH /usr/local/lib/node_modules/protoc-gen-ts
Potential Changes to all/entrypoint.sh
case $GEN_LANG in
...
"node(-ts)*") ...
...
esac
....
if [[ $GEN_LANG == "node-ts" ]]; then
protoc \
--plugin="protoc-gen-ts=${PROTOC_GEN_TS_PATH}" \
--ts_out=$OUT_DIR \
${PROTO_FILES[@]}
fi
I tried doing docker pull namely/protoc-pypthon
and docker couldn't find it. Please build and push them to the docker repo.
As of Python 3.6, __init__.py files are not required. This task is to make it optional to generate those files. This will change the behavior around
docker-protoc/all/entrypoint.sh
Line 133 in e2dcfde
This should be done by passing a command line flag to entrypoint.sh. The default should be to generate those files, to maintain backward compatibility with existing users.
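A sketch of such an opt-out (the --no-init-py flag name and helper function are hypothetical, not the actual entrypoint code):

```shell
# Default to generating __init__.py files, preserving current behavior;
# an explicit --no-init-py flag turns the touch step off.
GEN_INIT_PY=true
for arg in "$@"; do
  [ "$arg" = "--no-init-py" ] && GEN_INIT_PY=false
done

maybe_touch_init() {
  dir="$1"
  if [ "$GEN_INIT_PY" = true ]; then
    touch "$dir/__init__.py"
  fi
}
```

The generation loop would then call maybe_touch_init on each output directory instead of touching __init__.py unconditionally.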
Currently only the following languages are supported.
go ruby csharp java python objc
This is just a feature request to support node.
It looks like the github.com/golang/protobuf module will be deprecated soon in favor of google.golang.org/protobuf. The dependency may need to be updated to the new module.
The Go Team announced a new api for protobuf in March. Since then, a large scale migration has taken place that has left https://github.com/golang/protobuf as a legacy v1 implementation.
We are currently using namely's protoc-all docker image and would like to begin generating proto using the new v2 api, so I was wondering if you had any plans to upgrade this repository - or if we should fork and implement the changes just for us.
It will be a little tricky because the recommended install path is now:
go install google.golang.org/protobuf/cmd/protoc-gen-go
I am getting the following error message when using the language web:
/usr/local/bin/grpc_web_plugin: program not found or is not executable
--grpc-web_out: protoc-gen-grpc-web: Plugin failed with status code 1.
I am using the image in version 1.23_0:
docker run -v `pwd`:/defs namely/protoc-all -d /defs/protos --with-docs -l web
With 1.22_1 it runs without errors.
I also found out that the following command produces the same X_pb.js and X_pb.d.ts files as web should. Of course, it also generates the X_grpc_pb.js files.
docker run -v `pwd`:/defs namely/protoc-all -d /defs/protos --with-typescript --with-docs -l node
So maybe the web language is obsolete?
With a simple file structure that looks like this:
.
└── foo.proto
If I run:
docker run -v `pwd`:/defs namely/protoc-all:1.9 -f foo.proto -l python
I get the following output:
.
├── foo.proto
└── gen
├── __init__.py <- this is unexpected, I think
└── pb_python
├── __init__.py
├── foo_pb2.py
└── foo_pb2_grpc.py
It's an empty file. Haven't done much digging yet, just wanted to get the ticket open first.
cat gen/__init__.py
# nothin'
We should add a CI hook that builds and pushes docker images to hub automatically on master merges.
Currently Dart is not supported.
Are there any plans to support dart-lang as well?
https://hub.docker.com/search/?isAutomated=0&isOfficial=0&page=1&pullCount=0&q=namely&starCount=0
That page shows that only Go, Ruby, and Python are built, but your GitHub repo shows Java, Objective-C, and C# as seeming candidates. Please build at least Java, if not the others. An automated system would be nice, but don't let that stop you from building and pushing at least the first version of them manually.
Today, pulling proto files using the -d flag specifies a max-depth of 1.
docker-protoc/all/entrypoint.sh
Line 174 in 93b2381
Is there a specific reason for this? Can it be configurable?
We have a directory structure that looks like this:
src/main/proto/company_pb
├── billing
│ └── events.proto
├── commons
│ ├── sqs
│ │ ├── sqs_attribute.proto
│ │ ├── sqs_attributes.proto
│ │ └── sqs_message.proto
│ └── timestamp.proto
So the maxdepth flag causes the command to fail with this error:
Missing input file.
because no files are passed to the underlying method.
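The effect of the -maxdepth 1 restriction can be reproduced with plain find (directory names here are illustrative):

```shell
# A nested layout like the one above: one proto at the top level,
# one two directories deep.
mkdir -p demo/commons/sqs
touch demo/events.proto demo/commons/sqs/sqs_message.proto

find demo -maxdepth 1 -name '*.proto' | wc -l   # → 1 (the nested file is missed)
find demo -name '*.proto' | wc -l               # → 2 (all files found)
```

Dropping -maxdepth 1 (or making the depth configurable) would let the entrypoint pick up protos in nested directories.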
We would like to be able to customize the js_out portion of this GEN_STRING:
docker-protoc/all/entrypoint.sh
Lines 165 to 167 in 3a87206
Any thoughts on how you'd like to see that implemented?