
ocaml-grpc's People

Contributors

acerone85, ansiwen, doctor-pi, jeffa5, mbacarella, mbarbin, quernd, tmcgilchrist, wokalski


ocaml-grpc's Issues

eio streaming example only writes one record

I was trying out the eio routeguide example and noticed that the server streaming was only ever writing a single record to the response. I ran the lwt example, and it gave the expected result of a longer stream of location records. I found that adding a yield in routeguide/src/server.ml causes the stream to be flushed.

  ...
  encode feature |> Writer.contents |> f;
  Eio.Fiber.yield ())
  ...

Support for gRPC compression

Provide a minimal implementation of compression using gzip (which is supported in Go, making it a popular interoperability target). There is a general spec outlining how compression should work: https://github.com/grpc/grpc/blob/master/doc/compression.md. The decompress library looks promising: it provides gzip and zlib, is fairly popular, and is implemented in pure OCaml.

A bonus feature would be an API designed so that compression is extensible, letting library users provide their own compression algorithms.
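To make the framing concrete, here is a small sketch of gRPC's length-prefixed message format with the compressed flag, as described in the compression spec linked above. The `compress` parameter is a hypothetical stand-in for a real codec (e.g. gzip from decompress); it is not that library's API.

```ocaml
(* Sketch: gRPC frames each message as
   [1-byte compressed flag][4-byte big-endian length][payload].
   [compress], when supplied, is a hypothetical codec function. *)
let frame ?compress payload =
  let body, flag =
    match compress with
    | Some f -> (f payload, '\x01')  (* compressed-flag byte set *)
    | None -> (payload, '\x00')
  in
  let len = String.length body in
  let buf = Buffer.create (5 + len) in
  Buffer.add_char buf flag;
  (* 4-byte big-endian message length *)
  Buffer.add_char buf (Char.chr ((len lsr 24) land 0xff));
  Buffer.add_char buf (Char.chr ((len lsr 16) land 0xff));
  Buffer.add_char buf (Char.chr ((len lsr 8) land 0xff));
  Buffer.add_char buf (Char.chr (len land 0xff));
  Buffer.add_string buf body;
  Buffer.contents buf
```

An extensible API could then take such a `string -> string` pair (compress/decompress) plus a name to advertise in the `grpc-encoding` header.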

Server_streaming list_features not streaming responses gradually?

Hi,

I have a server that serves a stream of responses, each of which takes a long time to compute, and I would like a client that displays those responses gradually, as soon as they arrive.

I've modified the routeguide-lwt example in the ocaml-grpc repo with this diff:

diff -u -p -b -B -r -x .semantic.cache -x .depend -x CVS -x .hg -x .svn -x .git -x _darcs /home/pad/work/lang-ocaml/ocaml-grpc/examples/routeguide-lwt/src/ /home/pad/routeguide-lwt/src
diff -u -p -b -B -r -x .semantic.cache -x .depend -x CVS -x .hg -x .svn -x .git -x _darcs /home/pad/work/lang-ocaml/ocaml-grpc/examples/routeguide-lwt/src/server.ml /home/pad/routeguide-lwt/src/server.ml
--- /home/pad/work/lang-ocaml/ocaml-grpc/examples/routeguide-lwt/src/server.ml	2023-08-03 14:04:29.771438529 +0200
+++ /home/pad/routeguide-lwt/src/server.ml	2023-08-03 14:26:29.751509229 +0200
@@ -73,12 +73,21 @@ let calc_distance (p1 : Point.t) (p2 : P
   let c = 2.0 *. atan2 (sqrt a) (sqrt (1.0 -. a)) in
   Float.to_int (r *. c)
 
+let rec fibonacci n =
+  if n < 3 then
+    1
+  else
+    fibonacci (n-1) + fibonacci (n-2)
+
 let get_feature buffer =
   let decode, encode = Service.make_service_functions RouteGuide.getFeature in
   (* Decode the request. *)
   let point =
     Reader.create buffer |> decode |> function
-    | Ok v -> v
+    | Ok v -> 
+       let _ = fibonacci 40 in
+       Unix.sleepf 0.5;
+       v
     | Error e ->
         failwith
           (Printf.sprintf "Could not decode request: %s" (Result.show_error e))
@@ -123,9 +132,13 @@ let list_features (buffer : string) (f :
   let () =
     List.iter
       (fun (feature : Feature.t) ->
-        if in_range (Option.get feature.location) rectangle then
+        if in_range (Option.get feature.location) rectangle then begin
+            let res = fibonacci 40 in
+            Printf.printf "Computed one feature, fib = %d\n" res;
+            Unix.sleepf 0.001;
+            flush stdout;
           encode feature |> Writer.contents |> f
-        else ())
+        end else ())
       !features
   in
   Lwt.return Grpc.Status.(v OK)

Diff finished.  Thu Aug  3 14:36:48 2023

Unfortunately, when I run in one terminal

$./_build/default/src/server.exe routeguide/data/route_guide_db.json
Listening on port 8080 for grpc requests

and in the other

$ ./_build/default/src/client.exe
*** SIMPLE RPC ***
RESPONSE = {{ name = "Berkshire Valley Management Area Trail, Jefferson, NJ, USA";
  location = (Some { latitude = 409146138; longitude = -746188906 }) }}

*** SERVER STREAMING ***

even though the server seems to compute the responses gradually (it prints progress on stdout), the client seems blocked on SERVER STREAMING and needs to wait for all responses to be computed before finally printing them all at once.

Is there a way to fix the server or the client so the responses are displayed gradually?
Is Server_streaming the right way to solve this problem, or do I need to use bidirectional streaming?
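One thing worth checking (an assumption about the diff above, not a verified diagnosis): `Unix.sleepf` is a blocking system call, so under Lwt it stalls the whole event loop rather than yielding to other promises such as the response writer. A sketch of a cooperative version of the list_features loop, where `in_range`, `encode`, and `f` are hypothetical parameters standing in for the example's helpers:

```ocaml
(* Sketch only: replaces the blocking Unix.sleepf with the cooperative
   Lwt_unix.sleep, and List.iter with Lwt_list.iter_s, so the handler
   yields a promise between items and gives the connection a chance to
   flush each record before computing the next one. *)
let stream_features ~in_range ~encode ~f features =
  Lwt_list.iter_s
    (fun feature ->
      if in_range feature then (
        f (encode feature);      (* write one record *)
        Lwt_unix.sleep 0.001)    (* yield to the Lwt event loop *)
      else Lwt.return_unit)
    features
```

Even with this change, the grpc-lwt transport may still buffer the response body; this sketch only removes the event-loop stall on the server side.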

Provide opinionated cruft around client side connection and stream management

Currently we have this fluff all around the example.

There are two possible solutions:

  1. Simply put that logic in the Connection module in grpc-lwt (and in grpc-async respectively)

  2. Develop a more robust solution implementing grpc channels, excerpt from the docs:

    Channels are a key concept in gRPC. Streams in HTTP/2 enable multiple concurrent conversations on a single connection; channels extend this concept by enabling multiple streams over multiple concurrent connections. On the surface, channels provide an easy interface for users to send messages into; underneath the hood, though, an incredible amount of engineering goes into keeping these connections alive, healthy, and utilized.

    More about channels:

Re-enable ocaml-ci

ocaml-ci was following this project when it was on @jeffa5's account. Now that it has moved, the build is no longer working. ocaml-ci is an opinionated CI system for OCaml projects using dune and opam. It provides multi-version and multi-platform builds for OCaml, many of which are not available on other CI platforms, and it closely aligns with the opam-repository CI system.

I've created a PR ocurrent/ocaml-ci#769 to enable ocaml-ci on @dialohq. If you agree I can merge the PR and restore the CI functionality. You would need to install the GitHub App as documented here.

Please let me know if there are further questions about how to enable ocaml-ci or how ocaml-ci works.

server crashes when the client disconnects unexpectedly

I'm running the EIO server and client examples, and noticed that when I kill the client, the server crashes with the following error:

End_of_file Raised at Eio_linux__Low_level.readv.(fun) in file "lib_eio_linux/low_level.ml", line 158, characters 25-52
Called from Eio_unix__Rcfd.use in file "lib_eio/unix/rcfd.ml", line 161, characters 10-14
Re-raised at Eio_unix__Rcfd.use in file "lib_eio/unix/rcfd.ml", line 166, characters 6-41
Called from Eio__Flow.single_read in file "lib_eio/flow.ml", line 16, characters 12-27
Called from Gluten_eio.IO_loop.read_once.(fun) in file "eio/gluten_eio.ml", line 54, characters 10-45
Called from Gluten_eio.IO_loop.read_once in file "eio/gluten_eio.ml", line 51, characters 4-192
Called from Gluten_eio.IO_loop.read in file "eio/gluten_eio.ml", line 60, characters 10-31
Called from Eio__core__Fiber.any.(fun).wrap in file "lib_eio/core/fiber.ml", line 97, characters 16-20
Re-raised at Eio__core__Fiber.any in file "lib_eio/core/fiber.ml", line 138, characters 26-61
Called from Gluten_eio.IO_loop.start.(fun).read_loop_step in file "eio/gluten_eio.ml", line 126, characters 21-44
Re-raised at Eio__core__Switch.maybe_raise_exs in file "lib_eio/core/switch.ml", line 118, characters 21-56
Called from Eio__core__Switch.run_internal in file "lib_eio/core/switch.ml", line 150, characters 4-21
Called from Eio__core__Cancel.with_cc in file "lib_eio/core/cancel.ml", line 116, characters 8-12
Re-raised at Eio__core__Cancel.with_cc in file "lib_eio/core/cancel.ml", line 118, characters 32-40
Called from Dune__exe__Grpc_server.(fun).loop in file "grpc_test/grpc_server.ml", line 125, characters 12-28

I'm not sure whether there is somewhere I'm meant to handle this, whether connection handling should be set up differently to be more resilient, or whether this is unexpected behavior.

routeguide examples do not compile

Following the routeguide tutorial, I ran:

$ git clone https://github.com/dialohq/ocaml-grpc
$ cd ocaml-grpc
$ dune exec -- routeguide-server ./examples/routeguide/data/route_guide_db.json

But then I get this error:

File "examples/routeguide/src/server.ml", line 133, characters 26-42:
133 | let record_route (clock : _ Eio.Time.clock) (stream : string Seq.t) =
                                ^^^^^^^^^^^^^^^^
Error: The type constructor Eio.Time.clock expects 0 argument(s),
       but is here applied to 1 argument(s)

and with the client:

dune exec -- routeguide-client
File "examples/routeguide/src/client.ml", line 18, characters 60-66:
18 |   H2_eio.Client.create_connection ~sw ~error_handler:ignore socket
                                                                 ^^^^^^
Error: This expression has type Eio.Net.stream_socket
       but an expression was expected of type Eio.Flow.two_way
       The second object type has no method close

Question: purpose of the `Grpc.Status.t` returned with the `Ok` case.

Hello everyone.
I'm wondering why there is a status tupled with the RPC response as the return value of Client.call.

All the examples end up matching the result with

match response with
| Ok (response, _ok) -> ...
| Error error -> ...

As there are no examples showing a use case for this status, and all existing examples end up ignoring it, I am left wondering why it is given at all. Are there any cases where it is not the OK status? If so, which values can it take?

Support "trailers-only" responses

This is possibly related to the hang-forever bug reported by @mbacarella in dialohq/ocaml-grpc-fork#2.

Servers can respond with an empty body and with headers containing the gRPC status that would normally be packaged as trailing headers at the end of the body. E.g. if you call etcd with a non-existent method, the response is just HTTP headers:

((":status" "200")("content-type" "application/grpc")("grpc-status" "12")
("grpc-message" "unknown method This_method_is_missing"))

And the Lwt client will hang. (Status code 12 is UNIMPLEMENTED)

Confusingly, this is called "Trailers-only" in the gRPC spec.

How to fix this:

  • Check for gRPC status in the headers first
  • If they're missing, wait for trailers in the body
  • Only when both are missing, apply @mbacarella's workaround (in this case, we probably dialed a non-gRPC server)
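The decision procedure above can be sketched over a response's header list. The header key `grpc-status` comes from the gRPC-over-HTTP/2 spec; the type and function names here are made up for illustration.

```ocaml
(* Sketch: classify a response by whether the gRPC status already
   appears in the initial headers (a "Trailers-only" response) or
   must be awaited in the trailers at the end of the body. *)
type status_source =
  | From_headers of int   (* Trailers-only: status code found in headers *)
  | Wait_for_trailers     (* Normal response: status arrives in trailers *)

let classify (headers : (string * string) list) : status_source =
  match List.assoc_opt "grpc-status" headers with
  | Some s -> From_headers (int_of_string s)
  | None -> Wait_for_trailers
```

The etcd example above would classify as `From_headers 12` (UNIMPLEMENTED), so the client could return immediately instead of hanging while waiting for a body.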

Detecting closed stream

Currently the server keeps writing even after the client closes the connection.

I'm not sure if it's even possible to detect it.

no incremental display in routeguide-client

I followed the instructions in https://github.com/dialohq/ocaml-grpc/blob/main/examples/routeguide-tutorial.md

and ran dune exec -- routeguide-server ./examples/routeguide/data/route_guide_db.json
in one terminal and
dune exec -- routeguide-client
in another one. But, contrary to the documentation, which says the client will show responses from the streaming server one second at a time, I instead get all the results at the very end in one block.

I'm running the example from the ocaml-grpc repository, with eio 1.0 and OCaml 5.2.0 on Arch Linux.

$ opam list | grep eio
eio                        1.0              Effect-based direct-style IO API for OCaml
eio_linux                  1.0              Eio implementation for Linux using io-uring
eio_main                   1.0              Effect-based direct-style IO mainloop for OCaml
eio_posix                  1.0              Eio implementation for POSIX systems
gluten-eio                 0.5.0            EIO runtime for gluten
grpc-eio                   0.2.0            An Eio implementation of gRPC
h2-eio                     0.11.0           EIO support for h2
[pad@thinkstation ocaml-grpc (main)]$ ocamlc -v
The OCaml compiler, version 5.2.0
Standard library directory: /home/pad/.opam/5.2.0/lib/ocaml

New release of grpc and grpc-eio OPAM package

Hi,

Would it be possible to release grpc-eio on OPAM?
I would love to migrate from grpc-lwt to grpc-eio but could not find an easy way to install it via opam.
I'd rather avoid vendoring the code.

Vendor ocaml-protoc-plugin as a git subtree

Why?

The main repo sees very little activity and maintenance. There are some unanswered issues and unmerged pull requests there.

Solution

I believe we should vendor ocaml-protoc-plugin in a git subtree (+ possibly publish it as ocaml-protoc-plugin-grpc).

The reason for a subtree is to make it easy to either split it into a separate repo or upstream the changes to issuu/ocaml-protoc-plugin in the future.

An alternative

We could obtain maintainer access to https://github.com/issuu/ocaml-protoc-plugin. If we are given access, though, I would like to enforce our open source policies there; the main one is: every contributor who gets their PR accepted gets write access to the repo[1]. I've been quite successful in the past at keeping my open source projects healthy even after I lost interest in them, and I attribute that to this policy. Do you think it's possible @andersfugmann?

1: the exception being "typo" fixes that might be automated or spammy.

Which solution should we pursue?

I believe in doing things fast: I think we should create a subtree and vendor the repo now, and then, when/if issuu gives us the green light to maintain the upstream, we can remove the vendored version.

Reusing connections on the client

ocaml-h2 does not implement GOAWAY, so it's difficult to reasonably handle rolling upgrades of the gRPC servers the client is connected to. Until that is supported, I am not keen to implement a higher-level client abstraction.

We could use heuristics such as rotating connections after some time period. It's TBD, but such heuristics may need to be implemented anyway to handle servers that don't send GOAWAYs.

FWIW, the most common implementations (i.e. the official ones) leverage GOAWAY: the first GOAWAY is sent on SIGTERM and the second when the connection is actually shutting down.

When we can intercept GOAWAYs on the client side we will be able to mark a given connection as not reusable without disrupting existing streams on that connection for a graceful shutdown procedure.

The higher level client abstraction will also need to support the following:

  1. Keepalive/ping behavior
  2. Max concurrent streams
  3. Reconnects
  4. Checking readiness of the client (i.e. can it be used or not? For example, the server might be down temporarily.)
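The time-based rotation heuristic mentioned above can be sketched in a few lines. All names here are hypothetical; real code would wrap an h2 connection and feed in a monotonic clock.

```ocaml
(* Sketch: a connection records its creation time and is considered
   stale (not reusable for new streams) once it exceeds [max_age]
   seconds, approximating GOAWAY-less server rotation. *)
type conn = { created_at : float (* seconds, e.g. Unix.gettimeofday *) }

let reusable ~now ~max_age (c : conn) = now -. c.created_at < max_age

(* Pick the first still-fresh connection from a pool, if any. *)
let pick ~now ~max_age (conns : conn list) : conn option =
  List.find_opt (reusable ~now ~max_age) conns
```

A fuller version would also track in-flight streams per connection, so a stale connection keeps serving its existing streams but is skipped for new ones, which is exactly the behavior GOAWAY interception would give us.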

Write Interoperability Test Case

The grpc project provides a test suite description for interoperability tests. It would be useful to implement it for ocaml-grpc, testing against other popular gRPC implementations like Go or Rust.

Requirements:

  • Scripting to download and run Go implementation
  • Implementation of LWT client / server for interop
  • Implementation of Eio client / server for interop
  • Implementation of Core client / server for interop
  • Integration with GitHub actions
  • Runnable script for local development.

I expect following on from this there will be interoperability bugs to fix.

Add doc generation infra

We don't have good docs. There's a Makefile with a doc generation script, which is a good start. To automatically deploy the docs to https://dialohq.github.io/ocaml-grpc/ we need to run the doc generation script (preferably in a GitHub Action), put the result from _build/default/_doc/_html into /docs on the main branch, and commit it using the GitHub Action user.

It should be as simple as a GitHub workflow like this:

name: Publish docs
on:
  push:
    branches:
      - main 
jobs:
  docs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Generate docs 
        run: |
          # Run opam install(s)
          make docs
      - name: Commit report
        run: |
          # Commit and push only if the generated docs changed
          # (git diff --cached --quiet exits non-zero when there are staged changes).
          git add docs
          git diff --cached --quiet || (git commit -m "Update docs" && git push)

I wouldn't advise contributors to run the doc generation scripts during development and commit the changes, because that might introduce additional conflicts in generated code that we really don't care about.


To make the docs nicer, we should add an .mld index page.
