
Kafkare

A system monitoring tool for Kafka.

Home Page: https://www.kafkare.com/

Table of Contents

  1. Features
  2. Overview
  3. Documentation and Demo
    1. Two ways to generate sample Kafka data
      1. Manual data entry
      2. Streaming API
    2. Running the Dashboard Application
  4. Setup
    1. Connecting to an existing instance of Kafka
    2. Connecting the user database
  5. FAQ
  6. License
  7. Authors

Features

  • Cross-platform desktop application for real-time Kafka monitoring and data display
  • Monitored metrics are chosen based on feedback from real-life Kafka deployment crashes and on best practices
  • Full-stack integration, leveraging user authentication so that only authorized members can view the dashboard

Overview

Kafkare is a cross-platform Kafka monitoring dashboard application used to oversee the health of a Kafka cluster. Several key metrics are displayed, including consumer lag, number of topics, and system metrics such as CPU usage and available memory. Users can register for an account and log in to access the dashboard. Passwords are hashed with Bcrypt and stored in an external database.
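Although the server code is not shown here, the authentication flow described above is the standard one: hash the password with Bcrypt on registration, compare on login. Below is a minimal sketch, assuming the bcrypt npm package; the function names and salt rounds are illustrative, not Kafkare's actual code.

// Minimal sketch of Bcrypt-based password handling (illustrative only;
// function names and salt rounds are assumptions, not Kafkare's code).
const bcrypt = require('bcrypt');

const SALT_ROUNDS = 10;

// On registration: store only the resulting hash in the user database.
async function hashPassword(plainTextPassword) {
  return bcrypt.hash(plainTextPassword, SALT_ROUNDS);
}

// On login: compare the submitted password against the stored hash.
async function verifyPassword(plainTextPassword, storedHash) {
  return bcrypt.compare(plainTextPassword, storedHash);
}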

Documentation and Demo

Demo Setup

QUICK START

There are two ways to generate your Kafka data:

  1. Manually - For a controlled amount of data produced
  2. Using an API - For a constant stream of data produced

Running the demo Kafka cluster and entering data manually


From the root directory (Kafkare), go into the kafka-playground folder:

In the terminal:

Install all dependencies

npm install

Set up the Docker containers; a prebuilt Kafka cluster is provided for the demo:
export HOST_IP=$(ifconfig | grep -E "([0-9]{1,3}\.){3}[0-9]{1,3}" | grep -v 127.0.0.1 | awk '{ print $2 }' | cut -f2 -d: | head -n1)
docker-compose up

Run the data generator application to create topics and consumers:

npm run build-testbed
npm run testbed

Go to your browser and enter the following in the address bar:

localhost:8181

to see the data generator.

In the data generator, enter a new topic for the broker and submit it. Then enter the number of messages you want to produce and submit. This will produce that many messages to the Kafka cluster.
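For a sense of what the generator does behind the form, here is a rough producer sketch using the kafkajs client; the broker address, topic name, and message payloads are assumptions for illustration, not the testbed's actual code.

// Rough sketch: produce a fixed number of messages with kafkajs.
// Broker address, topic name, and payloads are illustrative assumptions.
const { Kafka } = require('kafkajs');

async function produceMessages(topic, count) {
  const kafka = new Kafka({ clientId: 'demo-producer', brokers: ['localhost:9092'] });
  const producer = kafka.producer();
  await producer.connect();

  for (let i = 0; i < count; i += 1) {
    await producer.send({
      topic,
      messages: [{ value: `test message ${i}` }],
    });
  }

  await producer.disconnect();
}

produceMessages('demo-topic', 100).catch(console.error);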

Running the demo Kafka cluster and using the API for constant data generation


From the root directory (Kafkare), go into the kafka-playground/streaming_data folder:

In the terminal:

Install all dependencies

npm install

Go back to the kafka-playground folder and install the dependencies there as well. In the terminal:

Install all dependencies

npm install

Set up the Docker containers; a prebuilt Kafka cluster is provided for the demo:
export HOST_IP=$(ifconfig | grep -E "([0-9]{1,3}\.){3}[0-9]{1,3}" | grep -v 127.0.0.1 | awk '{ print $2 }' | cut -f2 -d: | head -n1)
docker-compose up

Run the data streaming application to create topics and consumers:
Start your consumer by opening a new terminal in the Kafkare/kafka-playground directory and running:
npm run consumer
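The consumer script subscribes to the topic and logs what it reads; a minimal kafkajs consumer looks roughly like the sketch below (group ID, topic, and broker address are placeholder assumptions rather than the script's actual values).

// Rough sketch of a kafkajs consumer; groupId, topic, and broker address
// are placeholder assumptions.
const { Kafka } = require('kafkajs');

async function runConsumer() {
  const kafka = new Kafka({ clientId: 'demo-consumer', brokers: ['localhost:9092'] });
  const consumer = kafka.consumer({ groupId: 'demo-group' });

  await consumer.connect();
  await consumer.subscribe({ topic: 'demo-topic', fromBeginning: true });

  await consumer.run({
    eachMessage: async ({ topic, partition, message }) => {
      console.log(`${topic}[${partition}]: ${message.value.toString()}`);
    },
  });
}

runConsumer().catch(console.error);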

Open a new terminal window. From the root directory (Kafkare), go into the kafka-playground directory. Now create the topic in Kafka by running:

npm run topic
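Topic creation goes through Kafka's admin API; with kafkajs it looks roughly like the following sketch (topic name and partition count are assumptions, not necessarily what the script uses).

// Rough sketch of topic creation via the kafkajs admin client.
// Topic name and partition count are illustrative assumptions.
const { Kafka } = require('kafkajs');

async function createTopic() {
  const kafka = new Kafka({ brokers: ['localhost:9092'] });
  const admin = kafka.admin();

  await admin.connect();
  await admin.createTopics({
    topics: [{ topic: 'demo-topic', numPartitions: 1 }],
  });
  await admin.disconnect();
}

createTopic().catch(console.error);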

Finally, to generate the stream, run in the terminal:

npm run producer

Running the dashboard

Open a new terminal. From the root directory (Kafkare), run:

npm install
npm run build
npm start

You will see a login page where you can either log in with an existing account or create a new account.

To create a new account, click the Register button on the login page. After registering an account, you will be prompted to log in with it.

After successfully logging in, a desktop application with the Kafka monitoring dashboards will load, and you can start monitoring the running Kafka cluster.

Connecting to an existing instance of Kafka

In the kafka-playground/ directory, edit the docker-compose.yml file. Add the following environment variables, with the relevant information, to the kafka_exporter service:

Environment Variable   Description
KAFKA_SERVER           Addresses (host:port) of the Kafka server.
SASL_USERNAME          SASL user name.
SASL_PASSWORD          SASL user password.
  kafka_exporter:
    image: danielqsj/kafka-exporter
    ports:
      - '9308:9308'
    environment: 
      KAFKA_SERVER: <host:port>
      SASL_USERNAME: <SASL username>
      SASL_PASSWORD: <SASL password>
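Once the exporter container is running, it serves Prometheus-format metrics over HTTP on the mapped port 9308. As a quick sanity check that it can reach your cluster, you can fetch its metrics page; the sketch below assumes Node 18+ (built-in fetch) and that kafka_brokers, the exporter's broker-count gauge, is present.

// Quick connectivity check: fetch the exporter's metrics page and print the
// broker count. Assumes Node 18+ (global fetch) and the 9308 port mapping above.
async function checkExporter() {
  const res = await fetch('http://localhost:9308/metrics');
  const body = await res.text();
  const brokersLine = body
    .split('\n')
    .find((line) => line.startsWith('kafka_brokers'));
  console.log(brokersLine ?? 'kafka_brokers metric not found');
}

checkExporter().catch(console.error);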

Connecting the user database

In the server/db/ directory, edit the db.js file. Within the file, change the value of the PG_URI variable to the connection string of the Postgres database you are using.

const PG_URI =
  'postgres://<user>:<password>@<host>.db.elephantsql.com:5432/<db>';
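A typical db.js built on node-postgres wraps that connection string in a Pool and exports a query helper. The sketch below shows that common pattern; it is an assumption about the file's shape, not its exact contents.

// Sketch of a typical db.js using node-postgres (pg); the real file may differ.
// Replace the placeholder URI with the credentials for your own database.
const { Pool } = require('pg');

const PG_URI =
  'postgres://<user>:<password>@<host>.db.elephantsql.com:5432/<db>';

const pool = new Pool({ connectionString: PG_URI });

// Export a single query helper so the rest of the server shares one pool.
module.exports = {
  query: (text, params) => pool.query(text, params),
};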

FAQ

Docker Compose Error

Q1. I'm getting this error when I use docker-compose up

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "docker-compose", line 3, in <module>
  File "compose\cli\main.py", line 67, in main
  File "compose\cli\main.py", line 123, in perform_command
  File "compose\cli\command.py", line 69, in project_from_options
  File "compose\cli\command.py", line 132, in get_project
  File "compose\cli\docker_client.py", line 43, in get_client
  File "compose\cli\docker_client.py", line 170, in docker_client
  File "site-packages\docker\api\client.py", line 188, in __init__
  File "site-packages\docker\api\client.py", line 213, in _retrieve_server_version
docker.errors.DockerException: Error while fetching server API version: (2, 'CreateFile', 'The system cannot find the file specified.')

A1. Make sure Docker Desktop is up and running.

Q2: Why doesn't Kafka start when I use docker-compose?

A2: Make sure your HOST_IP is defined. On macOS or Linux use:

export HOST_IP=$(ifconfig | grep -E "([0-9]{1,3}\.){3}[0-9]{1,3}" | grep -v 127.0.0.1 | awk '{ print $2 }' | cut -f2 -d: | head -n1)

For Windows Users use:

export HOST_IP=$(ipconfig | grep -E "([0-9]{1,3}\.){3}[0-9]{1,3}" | grep -v 127.0.0.1 | awk '{ print $14 }' | cut -f2 -d: | head -n1)

License

Kafkare is released under the MIT License.

Authors

Jenniel Figuereo
Jiaxin Li
Joel Beger
Wai Fai Lau

