Using Kafka

Apache Kafka is a distributed event store and stream-processing platform. The project aims to provide a unified, high-throughput, low-latency platform for handling real-time data feeds.

To run Apache Kafka locally, we will use three containers (Kafka, ZooKeeper, and Kafdrop) defined in a Docker Compose file.

After starting the containers, wait a few moments and then connect with any Apache Kafka-compatible tool on TCP port 29092. To make testing easier in the development environment, the Compose file also includes the Kafdrop web UI.
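A Compose file wiring the three containers together typically looks something like the sketch below. The image names, versions, and listener settings are illustrative assumptions, not the repository's actual file; use the docker-compose.yml from the cloned repository.

```yaml
# Illustrative sketch only — the real file ships with the repository.
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0   # assumed image
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.5.0       # assumed image
    depends_on: [zookeeper]
    ports:
      - "29092:29092"                        # host port used by external tools
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9092,PLAINTEXT_HOST://0.0.0.0:29092
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
  kafdrop:
    image: obsidiandynamics/kafdrop          # assumed image
    depends_on: [kafka]
    ports:
      - "9000:9000"                          # Kafdrop web UI
    environment:
      KAFKA_BROKERCONNECT: kafka:9092
```

The two listeners matter: clients inside the Docker network (such as Kafdrop) reach the broker at kafka:9092, while tools on your machine use localhost:29092.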

Follow the steps below to start using Kafka:

  1. Clone the repository
    git clone https://github.com/devprime/kafka
  2. Start the containers
    docker-compose up -d
  3. List the three running containers
    docker ps
  4. Stop the containers
    docker-compose down

Configuring Topics in Kafka
Devprime-based microservices connect automatically to stream services such as Kafka. The examples use standard topics such as orderevents and paymentevents.

  1. Open Kafdrop in the browser at http://localhost:9000
  2. Go to the “Topic” option, click “New”, and add “orderevents” and then “paymentevents”
  3. Check the created topics

Kafdrop
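If you prefer the command line to the Kafdrop UI, the same topics can be created with the kafka-topics tool shipped inside the broker container. The container name kafka is an assumption about the Compose file; check the actual name with docker ps.

```shell
# Create the two example topics inside the broker container.
# "kafka" is an assumed container name — verify it with `docker ps`.
docker exec kafka kafka-topics --create --topic orderevents \
  --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
docker exec kafka kafka-topics --create --topic paymentevents \
  --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

# Confirm that both topics now exist.
docker exec kafka kafka-topics --list --bootstrap-server localhost:9092
```

Inside the container the broker is reached at localhost:9092; tools running on the host use localhost:29092 instead.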

Viewing messages in Kafka
When using the microservice, we will send events through Kafka and view them in Kafdrop by clicking on the topic and then on “View Messages”.

Kafdrop
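As a quick end-to-end check, you can also produce a test message from the command line and then read it back (or view it in Kafdrop). As above, the container name kafka and the sample JSON payload are assumptions for illustration.

```shell
# Send a hypothetical test event to the orderevents topic.
echo '{"event":"order-created","id":1}' | \
  docker exec -i kafka kafka-console-producer \
    --bootstrap-server localhost:9092 --topic orderevents

# Read one message back from the beginning of the topic.
docker exec kafka kafka-console-consumer \
  --bootstrap-server localhost:9092 --topic orderevents \
  --from-beginning --max-messages 1
```

The same message should appear in Kafdrop under orderevents → “View Messages”.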

Last modified August 20, 2024 (2f9802da)