
kafka-docker-compose

This project runs Apache Kafka with docker-compose.

To run this project, simply run:

docker-compose up

Understanding the docker-compose.yml

Compose will start two services, zookeeper and kafka, as listed in the yml file.

The first service is zookeeper, which Kafka requires to keep track of the various brokers and the network topology, and to synchronize other information. The Kafka broker talks to ZooKeeper, and that is all the communication ZooKeeper needs.

The second service is kafka itself, and we run just a single instance of it, that is to say one broker. The service listens on port 9092, which is mapped to the same port number on the Docker host; this is how the service communicates with the outside world.

The kafka service also sets several environment variables:

a) KAFKA_ADVERTISED_HOST_NAME: defaults to localhost. This is the address at which Kafka is advertised, and where producers and consumers can find it. To reach the broker from other machines, this should not be set to localhost but to the IP address or hostname at which the server can be reached on your network.

b) KAFKA_CREATE_TOPICS: topics to create at startup. The pattern is "topic1Name:numPartitions:replicationFactor,topic2Name:numPartitions:replicationFactor,...".

c) KAFKA_ZOOKEEPER_CONNECT: the host:port at which the ZooKeeper server can be reached.

d) KAFKA_ADVERTISED_PORT: the port on which the Kafka broker is advertised to clients (9092 here).
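
Putting these together, a compose file along these lines would match the description above. This is a sketch, not necessarily the project's exact file; the wurstmeister images are a common choice and are assumed here, and the UploadFile topic matches the one used later in this README:

```yaml
version: '2'
services:
  zookeeper:
    image: wurstmeister/zookeeper    # assumed image; the project may pin a different one
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka        # assumed image
    ports:
      - "9092:9092"                  # broker port mapped 1:1 onto the Docker host
    environment:
      KAFKA_ADVERTISED_HOST_NAME: localhost   # use your host's IP/hostname for network access
      KAFKA_ADVERTISED_PORT: 9092
      KAFKA_CREATE_TOPICS: "UploadFile:1:1"   # topicName:partitions:replicationFactor
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    depends_on:
      - zookeeper
```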

Running a simple message flow

In order for Kafka to start working, we need to create a topic within it. Producer clients can then publish streams of data (messages) to that topic, and consumers subscribed to the topic can read the data stream.
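
The topic/producer/consumer relationship can be pictured with a toy in-memory stand-in (plain Python, not Kafka itself; MiniBroker is a made-up name for illustration): a topic is an append-only log that producers append to and consumers read from an offset.

```python
from collections import defaultdict

class MiniBroker:
    """Toy in-memory stand-in for a broker: each topic is an append-only log."""
    def __init__(self):
        self.topics = defaultdict(list)

    def publish(self, topic, message):
        # A producer appends a message to the topic's log.
        self.topics[topic].append(message)

    def consume(self, topic, offset=0):
        # A consumer reads every message from a given offset onward.
        return self.topics[topic][offset:]

broker = MiniBroker()
broker.publish("UploadFile", "Test message produced.")
print(broker.consume("UploadFile"))  # ['Test message produced.']
```

Real Kafka adds partitions, replication, and durable storage on top of this idea, but the publish/subscribe flow in the terminals below follows the same shape.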

To do this, we need to start an interactive terminal inside the Kafka container. List the running containers to retrieve the Kafka container's name. For example, let's assume our container is named apache-kafka_kafka_1:

$ docker ps

With the kafka container's name, we can now drop inside this container:

$ docker exec -it apache-kafka_kafka_1 bash
bash-4.4#

Open two such terminals: one for the producer and one for the consumer.

Producer Side

In the prompt you chose for the producer, enter the following command. It starts a producer that publishes a data stream from standard input to Kafka on the topic created at startup (UploadFile, via KAFKA_CREATE_TOPICS):

bash-4.4# kafka-console-producer.sh --broker-list localhost:9092 --topic UploadFile

The producer is now ready to take input from the keyboard and publish it.

Consumer Side

Move on to the second terminal connected to your Kafka container. The following command starts a consumer subscribed to the UploadFile topic:

bash-4.4# kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic UploadFile

Back to the producer: you can now type messages in its prompt, and every time you hit return the new line is printed in the consumer prompt. For example:

Test message produced.

This message is transmitted to the consumer through Kafka, and you can see it printed at the consumer prompt.

kafka-docker-compose's People

Contributors

vipingoyal

