
appmetrics / appmetrics


App Metrics is an open-source and cross-platform .NET library used to record and report metrics within an application.

Home Page: https://app-metrics.io

License: Apache License 2.0

Languages: C# 99.75%, PowerShell 0.22%, Shell 0.03%, Smalltalk 0.01%
Topics: metrics, dotnetcore, health-check, monitoring, performance, dotnet, instrumentation, application-insights, dotnet-core, dotnet-standard

appmetrics's Introduction

App Metrics


What is App Metrics?

App Metrics is an open-source and cross-platform .NET library used to record metrics within an application. App Metrics can run on .NET Core or on the full .NET Framework. App Metrics abstracts away the underlying repository of your metrics, for example InfluxDB, Graphite, Prometheus, etc., by sampling and aggregating in memory and providing extensibility points to flush metrics to a repository at a specified interval.

App Metrics provides various metric types to measure things such as the rate of requests, the number of user logins over time, the time taken to execute a database query, the amount of free memory, and so on. Supported metric types are Gauges, Counters, Meters, Histograms, Timers, and the Application Performance Index (Apdex).
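
As a rough illustration (not taken from this README), here is a minimal sketch of recording a counter and a timer with the App.Metrics fluent API; the metric names and units are illustrative:

	using App.Metrics;
	using App.Metrics.Counter;
	using App.Metrics.Timer;

	// Build an in-memory metrics root; reporters/flushing are configured separately.
	var metrics = new MetricsBuilder().Build();

	// Count user logins over time.
	var logins = new CounterOptions { Name = "user-logins", MeasurementUnit = Unit.Calls };
	metrics.Measure.Counter.Increment(logins);

	// Time a database query; samples are aggregated in memory until flushed.
	var dbQueryTimer = new TimerOptions
	{
	    Name = "db-query",
	    DurationUnit = TimeUnit.Milliseconds,
	    RateUnit = TimeUnit.Seconds
	};
	using (metrics.Measure.Timer.Time(dbQueryTimer))
	{
	    // execute the query here
	}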

App.Metrics includes Exponentially Forward Decaying, Sliding Window, and Algorithm R reservoir implementations. For more details on reservoir sampling, see the docs.
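
A hedged sketch of selecting a reservoir per metric, assuming the Reservoir option on the metric options classes and the DefaultSlidingWindowReservoir implementation (class and namespace names may differ between versions):

	using App.Metrics.ReservoirSampling.SlidingWindow;
	using App.Metrics.Timer;

	// Override the default (forward-decaying) reservoir for this timer with a
	// sliding window over the last 1028 samples.
	var checkoutTimer = new TimerOptions
	{
	    Name = "checkout",
	    Reservoir = () => new DefaultSlidingWindowReservoir(1028)
	};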

Documentation

Latest Builds, Packages & Repo Stats

Branch   Build
dev      Azure DevOps
main     AppVeyor

Visualization

Dashboards can be imported from Grafana

Grafana Web Monitoring

Grafana/InfluxDB Generic Web Dashboard Demo

Grafana OAuth2 Client Web Monitoring

Grafana/InfluxDB Generic OAuth2 Web Dashboard Demo
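
The InfluxDB-backed dashboards above assume metrics are being flushed to InfluxDB. As a hedged sketch (assuming the App.Metrics.Reporting.InfluxDB package and its ToInfluxDb extension; the URL and database name are placeholders), a reporter can be wired up like this:

	using System;
	using System.Threading.Tasks;
	using App.Metrics;

	// Sample and aggregate in memory, flushing to InfluxDB every 5 seconds.
	var metrics = new MetricsBuilder()
	    .Report.ToInfluxDb("http://127.0.0.1:8086", "appmetricsdb", TimeSpan.FromSeconds(5))
	    .Build();

	// Run all configured reporters once (a scheduler normally does this on the interval).
	await Task.WhenAll(metrics.ReportRunner.RunAllAsync());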

How to build

Azure DevOps builds are triggered on commits and PRs to the dev branch.

  • Install the latest .NET Core 2.x SDK
  • Run build.ps1 or build.sh in the root of the repository

How to run benchmarks

App.Metrics includes benchmarking using BenchmarkDotNet.

Two benchmark projects exist targeting App.Metrics.Core and App.Metrics.Concurrency

	cd .\src\Core\benchmarks\App.Metrics.Benchmarks.Runner
	dotnet run -c "Release" --framework netcoreapp3.1

	cd .\src\Concurrency\benchmarks\App.Metrics.Concurrency.Benchmarks.Runner
	dotnet run -c "Release" --framework netcoreapp3.1

You'll then be prompted to choose a benchmark to run; the results are written to a markdown file in the output directory.

You can find the benchmark results here and here.
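
For orientation, a hypothetical BenchmarkDotNet benchmark (not one of the repository's own) showing the general shape of what the runner projects execute; the class and metric names are made up:

	using App.Metrics;
	using App.Metrics.Counter;
	using BenchmarkDotNet.Attributes;
	using BenchmarkDotNet.Running;

	[MemoryDiagnoser]
	public class CounterIncrementBenchmark
	{
	    private static readonly IMetricsRoot Metrics = new MetricsBuilder().Build();
	    private static readonly CounterOptions Counter = new CounterOptions { Name = "bench-counter" };

	    // Measures the cost of a single counter increment.
	    [Benchmark]
	    public void Increment() => Metrics.Measure.Counter.Increment(Counter);
	}

	public static class Program
	{
	    public static void Main() => BenchmarkRunner.Run<CounterIncrementBenchmark>();
	}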

Contributing

See the contribution guidelines for details.

Acknowledgements

Thanks to the companies and projects that provide free open-source licensing.

License

This library is released under the Apache 2.0 License (see LICENSE). Copyright (c) 2016 Allan Hardy.



App Metrics is based on the Metrics.NET library and, at the moment, uses the same reservoir sampling code as the original library, which is a port of the Java Dropwizard Metrics library.

Metrics.NET Licensed under these terms: "Metrics.NET is release under Apache 2.0 License Copyright (c) 2014 Iulian Margarintescu" see LICENSE

Dropwizard Metrics Licensed under these terms: "Copyright (c) 2010-2013 Coda Hale, Yammer.com Published under Apache Software License 2.0, see LICENSE"

appmetrics's People

Contributors

aaronontheweb, adamralph, alexmg, alhardy, arkatufus, cwe1ss, gitfool, jenyayel, karolz-ms, lavkeshdwivedi, lecaillon, longbigbronzeelephantfish1, lscpike, martinothamar, maximusya, mortengjesing, mts57017, nicholasnoise, ocdi, pgermishuys, prochnowc, raphac, rast1234, seif, snakefoot, stevelowe, suslovk, taffareljr, vhatsura, willgunn


appmetrics's Issues

InfluxDB Reporting API test app

Create a sample API with the TestClock configured to confirm that metric values are reported correctly when using test scripts.

Add to the sample to have a visual representation of the expected behaviour, as well as unit tests.

Allow reporters to format metric names

Metric name formatting depends on the time series database being reported to; it would be good to provide a way for reporters to format metric names when reporting.

Apdex Implementation

https://en.wikipedia.org/wiki/Apdex

  • Should Apdex be its own metric type or part of Histograms?
  • Just calculate an API's overall Apdex, or allow an Apdex per histogram?
  • Separate endpoint for Apdex scores?
  • Include Apdex metrics in the health endpoint?
  • Include the Apdex score per histogram in the metrics endpoint, or a separate Apdex section?
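
For context, Apdex is listed above as one of the supported metric types; a minimal, hedged sketch of tracking an operation against an Apdex T of 0.5 seconds with that API (the metric name is illustrative):

	using App.Metrics;
	using App.Metrics.Apdex;

	var metrics = new MetricsBuilder().Build();

	// Samples inside the block are classified as satisfied, tolerating or
	// frustrating relative to ApdexTSeconds and rolled up into an Apdex score.
	var apdex = new ApdexOptions { Name = "http-requests", ApdexTSeconds = 0.5 };
	using (metrics.Measure.Apdex.Track(apdex))
	{
	    // handle the request here
	}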

Support .NET 4.5.2

Projects are using .NET Standard; can .NET 4.5.2 be supported via .NET Standard as well?

Grafana dashboard showing the difference between reservoir types

Create a sample for this to validate the expected behaviour of each reservoir type, on top of having unit tests; the results are much clearer in a graph.

  • Add metrics with different types of reservoir sampling to the InfluxDB api sample
  • Create a Grafana dashboard showing each reservoir type
  • Add to documentation

Finalize all API contracts

Review the IMetrics contract to avoid breaking changes before bumping to rc.

  • The default IMetrics implementation includes measuring all metrics; separate this by metric type and expose an API into measurement methods per metric type, to allow better discoverability and a clearer API.
  • Move the sampling implementation from the App.Metrics core project into its own project; some functionality may be replaced in future with the HdrHistogram.NET and Math.NET packages.
  • Potentially expose hit ratio and other gauge types on the metrics API for better discoverability and ease of use.
  • IAdvancedMetrics can be improved in the same way as mentioned in the first point above.

Metrics Text Endpoint Middleware Never Resets the StringReporter buffer

Calling the /metrics-text endpoint appends to the previous output in the 1.0.0-beta2 build using .NET Core.

***** Start - Health Checks *****


	Health Status = {status}

	PASSED CHECKS
	DEGRADED CHECKS
	FAILED CHECKS
***** End - Health Checks *****

-- End String Reporter Report: 2016-12-22T21:41:31.0203Z --

-- Start String Reporter Report: 2016-12-22T21:41:33.2383Z --

***** Start - Environment Information *****

     Assembly Name = src
  Assembly Version = 1.0.26.0
         Host Name = platform-site-2411001161-rsmc3
        Ip Address = 10.233.71.7
        Local Time = 2016-12-22T21:41:33.2406+00:00
      Machine Name = .
                OS = debian
        OS Version = 8

        // Startup configuration in use when the issue was observed (1.0.0-beta2 on .NET Core):
        public void ConfigureServices(IServiceCollection services)
        {
            services.AddLogging();

            // Registers App.Metrics, JSON serialization, health checks and the metrics middleware.
            services.AddMetrics()
                .AddJsonSerialization()
                .AddHealthChecks()
                .AddMetricsMiddleware();
        }

        public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)
        {
            // Adds the App.Metrics middleware, including the /metrics-text endpoint, to the pipeline.
            app.UseMetrics();
        }

Metrics 2.0 spec

http://metrics20.org/spec/ + http://dieter.plaetinck.be/post/monitorama-pdx-metrics20/

The benefits from the canonical list of keys include:

  • easier to define checks, queries, and dashboards because of metric naming consistency
  • easier to write tools against the more consistent way of reasoning about metrics
  • easier to define rollup patterns, and more flexible, since one can decide to keep particular dimensions (like cloud server id) for a short time (since they're more ephemeral and only relevant short term), and other dimensions for a longer time

Netflix's Atlas monitoring tool independently seems to have arrived at the same sort of conclusions as the spec's author.
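
Relatedly, App.Metrics already supports attaching key/value dimensions to a metric via tags; a small hedged sketch (the tag keys and values are illustrative):

	using App.Metrics;
	using App.Metrics.Counter;

	var metrics = new MetricsBuilder().Build();

	// Dimensions as key/value tags, in the spirit of the Metrics 2.0 canonical keys.
	var requests = new CounterOptions
	{
	    Name = "http-requests",
	    Tags = new MetricTags(new[] { "env", "cloud_server_id" }, new[] { "staging", "i-0abc123" })
	};
	metrics.Measure.Counter.Increment(requests);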
