Using Kafka

Apache Kafka is a distributed event store and stream-processing platform. The project aims to provide a unified, high-throughput, low-latency platform for handling real-time data feeds.

To run Apache Kafka locally we will use three containers (Kafka, Zookeeper, Kafdrop) defined in a Docker Compose file.

After starting the containers, wait a few minutes, then connect with any Apache Kafka-compatible tool on TCP port 29092. To make testing easier in the development environment, this Docker Compose file also includes the Kafdrop tool.
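A minimal `docker-compose.yml` for this setup might look roughly like the sketch below. The image names, versions, and listener settings here are assumptions for illustration; the file in the repository is the authoritative version.

```yaml
# Sketch only: images and listener configuration are assumptions,
# not the exact contents of the repository's docker-compose.yml.
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.3.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:7.3.0
    depends_on: [zookeeper]
    ports:
      - "29092:29092"          # host-facing listener used by local tools
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

  kafdrop:
    image: obsidiandynamics/kafdrop
    depends_on: [kafka]
    ports:
      - "9000:9000"            # Kafdrop web UI
    environment:
      KAFKA_BROKERCONNECT: kafka:9092
```

The two advertised listeners are the usual reason a local setup like this works: containers reach the broker at `kafka:9092`, while tools on the host use `localhost:29092`.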

Follow the steps below to get started with Kafka:

  1. Clone the repository
    git clone https://github.com/devprime/kafka
  2. Start the containers
    docker-compose up -d
  3. List the three running containers
    docker ps
  4. Stop and remove the containers
    docker-compose down
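Because the broker can take a little while to start accepting connections, a small check like the following can confirm that the listener on port 29092 is up before you connect a client. This is a plain Python standard-library sketch; the host and port come from the setup above.

```python
import socket


def broker_reachable(host: str = "localhost", port: int = 29092,
                     timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the Kafka listener succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    if broker_reachable():
        print("Kafka listener is accepting connections on port 29092")
    else:
        print("Kafka is not reachable yet; the containers may still be starting")
```

This only verifies that the TCP port is open, not that the broker is fully healthy; for that, connecting with a real Kafka client (or opening Kafdrop) is the better check.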

Starting to set up topics in Kafka
Devprime-based microservices connect automatically to stream services such as Kafka. The examples use two standard topics, orderevents and paymentevents.

  1. Open Kafdrop in the browser at http://localhost:9000
  2. Go to “Topic”, choose “New”, and create “orderevents” and then “paymentevents”
  3. Check that the topics were created

Kafdrop

Viewing messages in Kafka
When using the microservice, events are sent through Kafka; you can view them in Kafdrop by clicking on the topic and then on “View Messages”.
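To give an idea of what you might see under “View Messages”, the sketch below builds a JSON payload for the orderevents topic. The field names and event type here are hypothetical; Devprime services define their own event schema.

```python
import json
import uuid
from datetime import datetime, timezone


def build_order_event(order_id: str, amount: float) -> str:
    """Serialize a hypothetical order event as JSON.

    The shape of this payload is illustrative only; real Devprime
    events follow the schema defined by the microservice.
    """
    event = {
        "eventId": str(uuid.uuid4()),
        "eventType": "OrderCreated",  # hypothetical event type
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "data": {"orderId": order_id, "amount": amount},
    }
    return json.dumps(event)


if __name__ == "__main__":
    # A payload like this, published to orderevents, is what would
    # appear when browsing the topic's messages in Kafdrop.
    print(build_order_event("order-123", 59.90))
```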

Kafdrop

Last modified October 17, 2023 (e38ae05b)