dialohq / ocaml-grpc
gRPC library for OCaml
Home Page: https://dialohq.github.io/ocaml-grpc
License: BSD 3-Clause "New" or "Revised" License
I was trying out the eio routeguide example and noticed that the server streaming was only ever writing a single record to the response. I ran the lwt example and it gave the expected result: a longer stream of location records. I found that adding a yield in routeguide/src/server.ml causes the stream to be flushed:
...
encode feature |> Writer.contents |> f;
Eio.Fiber.yield ())
...
The recently released h2 0.9.0 introduced a breaking change by splitting Body.t into Body.Writer.t and Body.Reader.t: anmonteiro/ocaml-h2#165
Let's adapt to these changes and then require h2 0.9.0 in the opam file.
This syntax in ocaml-grpc/lib/grpc-lwt/client.ml (line 55 in e2ba9f4) requires OCaml >= 4.13, if I'm not mistaken; at least I get compile errors with older compilers. By writing
let+ handler_res = handler_res in
it works at least with OCaml 4.11. I didn't try earlier versions.
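For context, the two spellings differ only in binding-operator punning, which OCaml 4.13 introduced. A self-contained sketch with a let+ defined over option (the real code binds a different monad; this is only to show the syntax):

```ocaml
(* Define a [let+] binding operator over [option]. *)
let ( let+ ) x f = Option.map f x

let double handler_res =
  (* let+ handler_res in ...            <- punned form, needs OCaml >= 4.13 *)
  let+ handler_res = handler_res in     (* explicit form, works on 4.11 too *)
  handler_res * 2

let () =
  assert (double (Some 21) = Some 42);
  assert (double None = None)
```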
Provide a minimal implementation of compression using gzip (which is supported in Go, making it a popular interoperability target). There is a general spec outlining how compression should work: https://github.com/grpc/grpc/blob/master/doc/compression.md. The decompress library looks promising: it provides gzip and zlib, it is fairly popular, and it is implemented in pure OCaml.
A bonus feature would be to design an API such that compression is extensible, so library users can provide their own compression algorithms.
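One possible shape for such an extensible API, sketched with hypothetical names (Codec, register, and find are not existing ocaml-grpc identifiers): codecs register under the name used in the grpc-encoding header, with the mandatory identity codec built in.

```ocaml
(* Hypothetical extensibility sketch, not the library's actual API. *)
module type Codec = sig
  val name : string          (* value for the grpc-encoding header, e.g. "gzip" *)
  val compress : string -> string
  val decompress : string -> string
end

(* The "identity" codec that the gRPC compression spec requires. *)
module Identity : Codec = struct
  let name = "identity"
  let compress s = s
  let decompress s = s
end

(* A registry so users can plug in their own algorithms (e.g. gzip via decompress). *)
let codecs : (string, (module Codec)) Hashtbl.t = Hashtbl.create 8
let register (module C : Codec) = Hashtbl.replace codecs C.name (module C : Codec)
let find name = Hashtbl.find_opt codecs name

let () =
  register (module Identity);
  match find "identity" with
  | Some m ->
      let module C = (val m : Codec) in
      assert (C.decompress (C.compress "abc") = "abc")
  | None -> assert false
```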
Hi,
I have a server that serves a stream of responses, each of which takes a long time to compute, and I would like a client that gradually displays those responses as soon as they arrive.
I've modified the routeguide-lwt example in the ocaml-grpc repo with this diff:
diff -u -p -b -B -r -x .semantic.cache -x .depend -x CVS -x .hg -x .svn -x .git -x _darcs /home/pad/work/lang-ocaml/ocaml-grpc/examples/routeguide-lwt/src/ /home/pad/routeguide-lwt/src
diff -u -p -b -B -r -x .semantic.cache -x .depend -x CVS -x .hg -x .svn -x .git -x _darcs /home/pad/work/lang-ocaml/ocaml-grpc/examples/routeguide-lwt/src/server.ml /home/pad/routeguide-lwt/src/server.ml
--- /home/pad/work/lang-ocaml/ocaml-grpc/examples/routeguide-lwt/src/server.ml 2023-08-03 14:04:29.771438529 +0200
+++ /home/pad/routeguide-lwt/src/server.ml 2023-08-03 14:26:29.751509229 +0200
@@ -73,12 +73,21 @@ let calc_distance (p1 : Point.t) (p2 : P
let c = 2.0 *. atan2 (sqrt a) (sqrt (1.0 -. a)) in
Float.to_int (r *. c)
+let rec fibonacci n =
+ if n < 3 then
+ 1
+ else
+ fibonacci (n-1) + fibonacci (n-2)
+
let get_feature buffer =
let decode, encode = Service.make_service_functions RouteGuide.getFeature in
(* Decode the request. *)
let point =
Reader.create buffer |> decode |> function
- | Ok v -> v
+ | Ok v ->
+ let _ = fibonacci 40 in
+ Unix.sleepf 0.5;
+ v
| Error e ->
failwith
(Printf.sprintf "Could not decode request: %s" (Result.show_error e))
@@ -123,9 +132,13 @@ let list_features (buffer : string) (f :
let () =
List.iter
(fun (feature : Feature.t) ->
- if in_range (Option.get feature.location) rectangle then
+ if in_range (Option.get feature.location) rectangle then begin
+ let res = fibonacci 40 in
+ Printf.printf "Computed one feature, fib = %d\n" res;
+ Unix.sleepf 0.001;
+ flush stdout;
encode feature |> Writer.contents |> f
- else ())
+ end else ())
!features
in
Lwt.return Grpc.Status.(v OK)
Diff finished. Thu Aug 3 14:36:48 2023
unfortunately when I run in one terminal
$./_build/default/src/server.exe routeguide/data/route_guide_db.json
Listening on port 8080 for grpc requests
and in the other
$ ./_build/default/src/client.exe
*** SIMPLE RPC ***
RESPONSE = {{ name = "Berkshire Valley Management Area Trail, Jefferson, NJ, USA";
location = (Some { latitude = 409146138; longitude = -746188906 }) }}
*** SERVER STREAMING ***
even though the server seems to gradually compute the responses (it prints progress on stdout), the client seems blocked on SERVER STREAMING and needs to wait for all responses to be computed before finally printing them all at once.
Is there a way to fix the server or the client to gradually display the responses?
Is Server_streaming the right way to solve this problem, or do I need to use bidirectional streaming?
Currently we have this fluff all around the example. There are two possible solutions:
1. Simply put that logic in the Connection module in grpc-lwt (and in grpc-async respectively).
2. Develop a more robust solution implementing gRPC channels; excerpt from the docs:
Channels are a key concept in gRPC. Streams in HTTP/2 enable multiple concurrent conversations on a single connection; channels extend this concept by enabling multiple streams over multiple concurrent connections. On the surface, channels provide an easy interface for users to send messages into; underneath the hood, though, an incredible amount of engineering goes into keeping these connections alive, healthy, and utilized.
More about channels:
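To make the excerpt concrete, here is a minimal, hypothetical sketch (not the library's API): a channel that round-robins new calls across several connections, each of which could multiplex many HTTP/2 streams. A real implementation would also handle health checking and reconnection.

```ocaml
(* [conn] stands in for a real h2 connection. *)
type conn = { id : int }

(* A channel owns several connections and rotates between them. *)
type channel = { conns : conn array; mutable next : int }

let create n = { conns = Array.init n (fun id -> { id }); next = 0 }

(* Pick the next connection in round-robin order. *)
let pick ch =
  let c = ch.conns.(ch.next) in
  ch.next <- (ch.next + 1) mod Array.length ch.conns;
  c

let () =
  let ch = create 3 in
  assert ((pick ch).id = 0);
  assert ((pick ch).id = 1);
  assert ((pick ch).id = 2);
  assert ((pick ch).id = 0)  (* wraps around *)
```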
ocaml-ci was following this project when it was on @jeffa5's account. Now that it has moved, the build is no longer working. ocaml-ci is an opinionated CI system for OCaml projects using dune and opam. It provides multi-version and multi-platform builds for OCaml, many of which are not available on other CI platforms, and it closely aligns with the opam-repository CI system.
I've created a PR, ocurrent/ocaml-ci#769, to enable ocaml-ci on @dialohq. If you agree, I can merge the PR and restore the CI functionality. You would need to install the GitHub App as documented here.
Please let me know if there are further questions about how to enable ocaml-ci or how ocaml-ci works.
I'm running the EIO server and client examples, and noticed that when I kill the client, the server crashes with the following error:
End_of_file
Raised at Eio_linux__Low_level.readv.(fun) in file "lib_eio_linux/low_level.ml", line 158, characters 25-52
Called from Eio_unix__Rcfd.use in file "lib_eio/unix/rcfd.ml", line 161, characters 10-14
Re-raised at Eio_unix__Rcfd.use in file "lib_eio/unix/rcfd.ml", line 166, characters 6-41
Called from Eio__Flow.single_read in file "lib_eio/flow.ml", line 16, characters 12-27
Called from Gluten_eio.IO_loop.read_once.(fun) in file "eio/gluten_eio.ml", line 54, characters 10-45
Called from Gluten_eio.IO_loop.read_once in file "eio/gluten_eio.ml", line 51, characters 4-192
Called from Gluten_eio.IO_loop.read in file "eio/gluten_eio.ml", line 60, characters 10-31
Called from Eio__core__Fiber.any.(fun).wrap in file "lib_eio/core/fiber.ml", line 97, characters 16-20
Re-raised at Eio__core__Fiber.any in file "lib_eio/core/fiber.ml", line 138, characters 26-61
Called from Gluten_eio.IO_loop.start.(fun).read_loop_step in file "eio/gluten_eio.ml", line 126, characters 21-44
Re-raised at Eio__core__Switch.maybe_raise_exs in file "lib_eio/core/switch.ml", line 118, characters 21-56
Called from Eio__core__Switch.run_internal in file "lib_eio/core/switch.ml", line 150, characters 4-21
Called from Eio__core__Cancel.with_cc in file "lib_eio/core/cancel.ml", line 116, characters 8-12
Re-raised at Eio__core__Cancel.with_cc in file "lib_eio/core/cancel.ml", line 118, characters 32-40
Called from Dune__exe__Grpc_server.(fun).loop in file "grpc_test/grpc_server.ml", line 125, characters 12-28
I'm not sure whether there is somewhere I'm intended to handle this, whether connection handling should be set up differently to be more resilient, or whether this is unexpected behavior.
Following the routeguide tutorial, I did:
$ git clone https://github.com/dialohq/ocaml-grpc
$ cd ocaml-grpc
$ dune exec -- routeguide-server ./examples/routeguide/data/route_guide_db.json
But then I get this error:
File "examples/routeguide/src/server.ml", line 133, characters 26-42:
133 | let record_route (clock : _ Eio.Time.clock) (stream : string Seq.t) =
^^^^^^^^^^^^^^^^
Error: The type constructor Eio.Time.clock expects 0 argument(s),
but is here applied to 1 argument(s)
and with the client:
dune exec -- routeguide-client
File "examples/routeguide/src/client.ml", line 18, characters 60-66:
18 | H2_eio.Client.create_connection ~sw ~error_handler:ignore socket
^^^^^^
Error: This expression has type Eio.Net.stream_socket
but an expression was expected of type Eio.Flow.two_way
The second object type has no method close
Hello everyone.
I'm wondering why there's a status tupled with the RPC response as the return value of Client.call.
All the examples end up matching the result with
match response with
| Ok (response, _ok) -> ...
| Error error -> ...
As there are no examples showing a use case for this status, and all existing examples end up ignoring it, I am left wondering why it is given at all. Are there any cases where it is not the OK status? If so, which values can it take?
This is possibly related to the hang-forever bug reported by @mbacarella in dialohq/ocaml-grpc-fork#2.
Servers can respond with an empty body and headers containing the gRPC status that would normally be packaged as trailing headers at the end of the body. E.g. if you call etcd with a non-existing method, the response is just HTTP headers:
((":status" "200")("content-type" "application/grpc")("grpc-status" "12")
("grpc-message" "unknown method This_method_is_missing"))
And the Lwt client will hang. (Status code 12 is UNIMPLEMENTED.)
Confusingly, this is called "Trailers-only" in the gRPC spec.
How to fix this:
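A hedged sketch of the detection side, assuming headers arrive as an association list (as in the trace above): treat a response whose initial headers already carry grpc-status as Trailers-only and surface the status immediately instead of waiting for a body that will never come.

```ocaml
(* Hypothetical helpers; ocaml-grpc's real header representation may differ. *)
let grpc_status headers = List.assoc_opt "grpc-status" headers

(* A Trailers-only response carries grpc-status in the *initial* headers. *)
let is_trailers_only headers =
  match grpc_status headers with Some _ -> true | None -> false

let () =
  let headers =
    [ (":status", "200"); ("content-type", "application/grpc");
      ("grpc-status", "12"); ("grpc-message", "unknown method") ]
  in
  assert (is_trailers_only headers);
  assert (grpc_status headers = Some "12")
```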
Currently the server keeps writing even after the client closes the connection.
I'm not sure if it's even possible to detect it.
Provide a solution for the benchmarking set in https://github.com/LesnyRumcajs/grpc_bench.
That will guide further performance improvements and give us a baseline for gRPC/H2.
Here's an OCaml library for OpenTelemetry.
The details are unclear to me, i.e. how we should propagate the IDs. There are also multiple propagators:
I don't understand how/if they are stitched together.
I followed the instructions in https://github.com/dialohq/ocaml-grpc/blob/main/examples/routeguide-tutorial.md
and ran dune exec -- routeguide-server ./examples/routeguide/data/route_guide_db.json
in one terminal and
dune exec -- routeguide-client
in another one. But contrary to the documentation, which says the client will show responses from the streaming server one second at a time, I get all the results at the very end in one block.
I'm running the example from the ocaml-grpc repository, with eio 1.0 and OCaml 5.2.0
on arch linux.
opam list | grep eio
eio 1.0 Effect-based direct-style IO API for OCaml
eio_linux 1.0 Eio implementation for Linux using io-uring
eio_main 1.0 Effect-based direct-style IO mainloop for OCaml
eio_posix 1.0 Eio implementation for POSIX systems
gluten-eio 0.5.0 EIO runtime for gluten
grpc-eio 0.2.0 An Eio implementation of gRPC
h2-eio 0.11.0 EIO support for h2
[pad@thinkstation ocaml-grpc (main)]$ ocamlc -v
The OCaml compiler, version 5.2.0
Standard library directory: /home/pad/.opam/5.2.0/lib/ocaml
Hi,
Would it be possible to release grpc-eio on OPAM?
I would love to migrate from grpc-lwt to grpc-eio but could not find an easy way to install it via opam.
I'd rather avoid vendoring the code.
The main repo sees very little activity and maintenance. There are some unanswered issues and unmerged pull requests there.
I believe we should vendor ocaml-protoc-plugin in a git subtree (+ possibly publish it as ocaml-protoc-plugin-grpc). The reason for a subtree is to make it easy either to split it into a separate repo or to upstream the changes to issuu/ocaml-protoc-plugin in the future.
We could obtain maintainer access to https://github.com/issuu/ocaml-protoc-plugin. If we are given access, I would like to enforce our open source policies there, though; the main one is: every contributor who gets their PR accepted gets write access to the repo[1]. I've been quite successful in the past at keeping my open source projects healthy even after I lost interest in them. I attribute that to this policy. Do you think it's possible @andersfugmann?
1: the exception being "typo" fixes that might be automated or spammy.
I believe in doing things fast - I think we should indeed create a subtree and vendor the repo and then when/if we are given a green light from issuu to maintain the upstream, we can remove the vendored version.
ocaml-h2 does not implement GOAWAY, so it's difficult to reasonably handle rolling upgrades of the gRPC servers the client is connected to. Until that is supported, I am not keen to implement a higher-level client abstraction.
We could use some heuristics, such as rotating the connections after some time period. It's TBD, but maybe such heuristics will need to be implemented anyway to handle servers that don't send GOAWAYs.
FWIW, most common implementations (i.e. the official ones) leverage GOAWAY: the first GOAWAY is sent on SIGTERM and the second when the connection is actually shutting down.
When we can intercept GOAWAYs on the client side, we will be able to mark a given connection as not reusable without disrupting existing streams on that connection, enabling a graceful shutdown procedure.
The higher-level client abstraction will also need to support the following:
Original bug report at: ClickHouse/ClickHouse#52465
Could not figure out why it is not working just by looking at Wireshark traces; it seems related to an HTTP/2 behaviour that their server implementation does not like… Just posting this here for reference, but I would be happy if anybody has an answer to this :)
The grpc project provides a test suite description for interoperability tests. It would be useful to provide an implementation of that for ocaml-grpc, testing against other popular gRPC implementations like Go or Rust.
Requirements:
I expect that, following on from this, there will be interoperability bugs to fix.
We don't have good docs. There's a Makefile with a doc generation script, which is a good start. To automatically deploy the docs to https://dialohq.github.io/ocaml-grpc/ we need to run the doc generation script (preferably in a GitHub Action), then put the result from _build/default/_doc/_html into /docs on the main branch and commit it using the GitHub Action user.
It should be as simple as a GitHub workflow like this:
name: Publish docs
on:
  push:
    branches:
      - main
jobs:
  docs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Generate docs
        run: |
          # Run opam install(s)
          make docs
      - name: Commit report
        run: |
          # run only if anything has changed!
          # if $(git diff --quiet) returns a non zero result
          git add docs
          git commit -m "Update docs"
          git push
I wouldn't advise contributors to run the doc generation scripts during development and commit the changes, because that might introduce additional conflicts in generated code that we really don't care about.
To make the docs nicer we should add an .mld index page.