Comments (17)

graemerocher commented on August 13, 2024

There is no official integration micrometer-metrics/micrometer#656

So we are waiting for that to be solved

gaetancollaud commented on August 13, 2024

@mr-tim I have the same question as @nichuHere. How did you manage to do that?

I've tried this but none of the ports are open :(

micronaut:
  server:
    port: 8088

endpoints:
  all:
    port: 8085

graemerocher commented on August 13, 2024

Once the new release is out we can include metrics

oehme commented on August 13, 2024

1.7.0 is out. Note that the client interceptor currently gives no indication of which channel a metric came from, so if you use the same interface over multiple channels, you won't be able to tell which hosts are slow.
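
If distinguishing channels matters, one possible workaround is to register a separate metrics interceptor per channel and tag its meters with the target host. This is only a sketch: it assumes Micrometer's MetricCollectingClientInterceptor and its counter/timer customizer constructor, and the ChannelFactory/buildChannel names are made up for illustration.

import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.binder.grpc.MetricCollectingClientInterceptor;

class ChannelFactory {

    private final MeterRegistry registry;

    ChannelFactory(MeterRegistry registry) {
        this.registry = registry;
    }

    // Sketch: one interceptor per channel, tagging its counters and timers with the
    // target host so metrics from different channels can be told apart.
    ManagedChannel buildChannel(String host, int port) {
        MetricCollectingClientInterceptor metrics = new MetricCollectingClientInterceptor(
                registry,
                counter -> counter.tag("channel", host),
                timer -> timer.tag("channel", host));
        return ManagedChannelBuilder.forAddress(host, port)
                .usePlaintext()
                .intercept(metrics)
                .build();
    }
}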

noam-alchemy commented on August 13, 2024

Hey, I noticed that micrometer shipped this over a year ago. Any updates on this?

mr-tim commented on August 13, 2024

Metrics for gRPC endpoints aside - is there a way to get the endpoints that expose metrics (e.g. MetricsEndpoint, PrometheusEndpoint) up and running (perhaps on another port) when using micronaut-grpc? Currently I get an HTTP/2 error back if I try to access the endpoints on the port configured for gRPC:

io.netty.handler.codec.http2.Http2Exception: Unexpected HTTP/1.x request: GET /prometheus 
	at io.netty.handler.codec.http2.Http2Exception.connectionError(Http2Exception.java:103)
	at io.netty.handler.codec.http2.Http2ConnectionHandler$PrefaceDecoder.readClientPrefaceString(Http2ConnectionHandler.java:302)
	at io.netty.handler.codec.http2.Http2ConnectionHandler$PrefaceDecoder.decode(Http2ConnectionHandler.java:239)
	at io.netty.handler.codec.http2.Http2ConnectionHandler.decode(Http2ConnectionHandler.java:438)
	at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:498)
	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:437)
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:276)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:377)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:355)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:377)
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:363)
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:834)

graemerocher commented on August 13, 2024

maybe in Micronaut 2.0 M1 (just released) with http/2 support enabled

mr-tim commented on August 13, 2024

Thanks, will give that a try - I actually just managed to fix this by adding the appropriate io.micronaut:micronaut-http dependency - this results in gRPC running on 50051 alongside HTTP on 8080.
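
For reference, in the Gradle (Kotlin DSL) notation used later in this thread, that fix amounts to roughly the line below - a sketch of the single dependency mentioned above, not a complete dependency list:

implementation("io.micronaut:micronaut-http")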

graemerocher commented on August 13, 2024

Nice

nichuHere commented on August 13, 2024

Thanks, will give that a try - I actually just managed to fix this by adding the appropriate io.micronaut:micronaut-http dependency - this results in gRPC running on 50051 alongside HTTP on 8080.
@mr-tim @graemerocher
Can you explain this further? I also need to send gRPC metrics to Prometheus.

nichuHere commented on August 13, 2024

@gaetancollaud are you trying to get the gRPC server running on a specific port?
In that case, you need:

grpc:
  server:
    port: 8088
    keep-alive-time: 3h

gaetancollaud commented on August 13, 2024

@nichuHere No, I'm trying to make the HTTP management port work. Since I use Kubernetes, I would like to have liveness and readiness probes.

I don't actually use the grpc server. This app is just a client.
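
For that use case, a minimal sketch: assuming micronaut-management plus an embedded HTTP server (for example io.micronaut:micronaut-http-server-netty, as discussed in the following comments) are on the classpath, the configuration from the earlier comment should open the management port, and the health endpoint can be exposed explicitly:

micronaut:
  server:
    port: 8088

endpoints:
  all:
    port: 8085
  health:
    enabled: true
    sensitive: false

Kubernetes liveness and readiness probes could then target /health on port 8085.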

mr-tim commented on August 13, 2024

For me it was just a case of making sure that my application had a dependency on the io.micronaut:micronaut-http jar. Once I had that, Micronaut started up with the HTTP port open (you should be able to see this in the logs at startup).

HurmuzacheCiprian commented on August 13, 2024

There is no official integration micrometer-metrics/micrometer#656

So we are waiting for that to be solved

Looks like this is solved now

graemerocher commented on August 13, 2024

@burtbeckwith please take a look at this. Thanks

graemerocher commented on August 13, 2024

This was integrated with micronaut-projects/micronaut-micrometer#321

pumano commented on August 13, 2024

I found out how to run micronaut-grpc with micronaut-management on different ports:

grpc:
  server:
    port: 8080
    
micronaut:
  server:
    port: 8333
  application:
    name: ms-example
  metrics:
    enabled: true
    export:
      prometheus:
        enabled: true
        step: PT1M
        descriptions: true

endpoints:
  prometheus:
    sensitive: false

Just add io.micronaut:micronaut-http-server-netty as a dependency and set micronaut.server.port as shown above. You need a web server to serve the metrics endpoint, but micronaut-grpc-runtime uses a different server (grpc-netty) that does not support HTTP/1.1, which Prometheus needs for scraping.

List of dependencies needed for Prometheus:

// monitoring
runtimeOnly("io.micronaut:micronaut-http-server-netty") // http server for metrics
implementation("io.micronaut:micronaut-management") // expose metrics via endpoint
implementation("io.micronaut.micrometer:micronaut-micrometer-core") // micrometer core
implementation("io.micronaut.micrometer:micronaut-micrometer-registry-prometheus") // prometheus registry

I can also confirm that micronaut-micrometer contains gRPC interceptors that intercept requests and record metrics out of the box.

For example:

# HELP grpc_server_processing_duration_seconds The total time taken for the server to complete the call
# TYPE grpc_server_processing_duration_seconds summary
grpc_server_processing_duration_seconds_count{method="GetProjects",methodType="UNARY",service="com.projects.v7.ProjectsApi",statusCode="OK",} 0.0
grpc_server_processing_duration_seconds_sum{method="GetProjects",methodType="UNARY",service="com.projects.v7.ProjectsApi",statusCode="OK",} 0.0
grpc_server_processing_duration_seconds_count{method="GetProjects",methodType="UNARY",service="com.projects.v7.ProjectsApi",statusCode="INVALID_ARGUMENT",} 2.0
grpc_server_processing_duration_seconds_sum{method="GetProjects",methodType="UNARY",service="com.projects.v7.ProjectsApi",statusCode="INVALID_ARGUMENT",} 0.074629084
# HELP grpc_server_processing_duration_seconds_max The total time taken for the server to complete the call
# TYPE grpc_server_processing_duration_seconds_max gauge
grpc_server_processing_duration_seconds_max{method="GetProjects",methodType="UNARY",service="com.projects.v7.ProjectsApi",statusCode="OK",} 0.0
grpc_server_processing_duration_seconds_max{method="GetProjects",methodType="UNARY",service="com.projects.v7.ProjectsApi",statusCode="INVALID_ARGUMENT",} 0.067716709

These metrics are the same as those produced by grpc-spring-boot-starter.
