How to create a Kafka topic using Docker Compose

1 Feb 2024 · 3 min read


Apache Kafka is a distributed streaming platform that lets you build real-time streaming data pipelines and applications. Setting up Kafka can be complex, but Docker Compose simplifies the process by defining and running multi-container Docker applications. This guide provides a step-by-step approach to creating a Kafka topic using Docker Compose, making it accessible to both developers and DevOps professionals.


Prerequisites

Before diving into the creation process, make sure you have the following prerequisites installed on your system:

  • Docker – provides the ability to build, deploy, and run applications using containers.
  • Docker Compose – a tool for defining and running multi-container Docker applications.

Step 1: Create a Docker Compose file

The first step involves creating a docker-compose.yml file. This file defines the Kafka and Zookeeper services needed to run the Kafka instance. Zookeeper is a centralized service for maintaining configuration information, naming, distributed synchronization, and providing group services.

version: '3'

services:
  zookeeper:
    image: wurstmeister/zookeeper
    container_name: zookeeper
    ports:
      - "2181:2181"
    networks:
      - kafka-net

  kafka:
    image: wurstmeister/kafka
    container_name: kafka
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_HOST_NAME: localhost
      KAFKA_CREATE_TOPICS: "YourTopicName:1:1"
    depends_on:
      - zookeeper
    networks:
      - kafka-net

networks:
  kafka-net:
    driver: bridge

Replace YourTopicName with the desired name for your Kafka topic. The format of the KAFKA_CREATE_TOPICS environment variable is TopicName:NumberOfPartitions:ReplicationFactor.
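A single entry in this format can be split into its three fields with standard shell tools; a minimal sketch (the spec value and variable names are just illustrative):

```shell
# Split one TopicName:NumberOfPartitions:ReplicationFactor entry on ':'
spec="YourTopicName:1:1"
IFS=':' read -r topic partitions replication <<< "$spec"
echo "topic=$topic, partitions=$partitions, replication=$replication"
```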

Step 2: Running Docker Compose

Navigate to the directory containing the docker-compose.yml file and run the following command in the terminal:

docker-compose up -d

This command will download the Docker images needed for Kafka and Zookeeper, then launch the containers in detached mode.

Step 3: Verify topic creation

To verify that the Kafka topic has been created, use the kafka-topics command-line tool that ships with Kafka. List the existing topics with:

docker-compose exec kafka kafka-topics.sh --list --zookeeper zookeeper:2181

You should see YourTopicName listed among the topics.
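When scripting, you can check for the topic instead of reading the list by eye; a sketch using grep, where topics.txt stands in for the captured output of the list command above:

```shell
# Simulated topic-list output, captured to a file for illustration
printf '__consumer_offsets\nYourTopicName\n' > topics.txt

# -q: quiet, -x: match the whole line exactly
if grep -qx 'YourTopicName' topics.txt; then
  echo "topic exists"
fi
```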

Step 4: Produce and consume messages

To further test your setup, you can produce and consume messages with the Kafka console producer and consumer scripts.

Message production:

From the directory containing your docker-compose.yml file, start a console producer:

docker-compose exec kafka kafka-console-producer.sh --broker-list localhost:9092 --topic YourTopicName

After running the command, type your messages in the console; each line you enter is sent as a separate message. Press Ctrl+D to exit the producer.

Message consumption:

Open another terminal session and start a console consumer:

docker-compose exec kafka kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic YourTopicName --from-beginning

You should see the messages you produced previously.

Step 5: Create a new Kafka topic (optional)

By default, the Kafka container creates the topics defined in the KAFKA_CREATE_TOPICS variable of the docker-compose.yml file, but you can also create new topics manually with the following command:

docker-compose exec kafka kafka-topics.sh --create --topic NewTopicName --partitions 1 --replication-factor 1 --bootstrap-server kafka:9092

Replace "NewTopicName" with the name of the new topic. The command above creates a topic with a single partition and a single replica, via the Kafka broker listening on port 9092.

Next, list the topics to verify that the topic was created:

docker-compose exec kafka kafka-topics.sh --list --zookeeper zookeeper:2181

This will list all topics, including the ones created above.

NOTE: Due to limitations in metric names, topic names containing a period ('.') or an underscore ('_') may collide, because both characters map to the same character in metric names. To avoid problems, use one or the other, but not both.
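The collision can be illustrated in the shell, assuming metric names replace '.' with '_' (the sanitize function below is a simplified illustration, not Kafka's actual code):

```shell
# Map '.' to '_', as metric-name sanitization does
sanitize() { printf '%s\n' "${1//./_}"; }

# "my.topic" and "my_topic" end up with the same metric name
sanitize "my.topic"   # my_topic
sanitize "my_topic"   # my_topic
```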


You have now successfully created a Kafka topic using Docker Compose and verified its functionality by producing and consuming messages. This setup not only simplifies the process of managing Kafka, but also provides a scalable and easily reproducible environment for your streaming applications. Whether you're developing locally or deploying to a production environment, Docker Compose with Kafka offers a powerful set of tools to simplify data streaming pipelines.
