---
description: Developing event-driven applications with Kafka and Docker
keywords: kafka, container-supported development
title: Developing event-driven applications with Kafka and Docker
linktitle: Event-driven apps with Kafka
summary: |
  This guide explains how to run Apache Kafka in Docker containers.
tags: [distributed-systems]
languages: [js]
aliases:
- /guides/use-case/kafka/
params:
  time: 20 minutes
---
With the rise of microservices, event-driven architectures have become increasingly popular.
[Apache Kafka](https://kafka.apache.org/), a distributed event streaming platform, is often at the
heart of these architectures. However, setting up and deploying your own Kafka instance for development
can be tricky. Fortunately, Docker and containers make this much easier.
In this guide, you will learn how to:
1. Use Docker to launch a Kafka cluster
2. Connect a non-containerized app to the cluster
3. Connect a containerized app to the cluster
4. Deploy Kafka-UI to help with troubleshooting and debugging
## Prerequisites
The following prerequisites are required to follow along with this how-to guide:
- [Docker Desktop](https://www.docker.com/products/docker-desktop/)
- [Node.js](https://nodejs.org/en/download/package-manager) and [yarn](https://yarnpkg.com/)
- Basic knowledge of Kafka and Docker
## Launching Kafka
Beginning with [Kafka 3.3](https://www.confluent.io/blog/apache-kafka-3-3-0-new-features-and-updates/), deploying Kafka became much simpler: thanks to KRaft (Kafka Raft), ZooKeeper is no longer required, which makes setting up a Kafka instance for local development far easier. With the release of [Kafka 3.8](https://www.confluent.io/blog/introducing-apache-kafka-3-8/), a new [kafka-native](https://hub.docker.com/r/apache/kafka-native) Docker image is also available, providing significantly faster startup and a lower memory footprint.
> [!TIP]
>
> This guide uses the `apache/kafka` image, as it includes many helpful scripts to manage and work with Kafka. However, you may want to use the `apache/kafka-native` image, as it starts more quickly and requires fewer resources.
### Starting Kafka
Start a basic Kafka cluster by following these steps. This example launches a cluster and publishes port 9092 on the host so that a natively running application can connect to it.
1. Start a Kafka container by running the following command:
```console
$ docker run -d --name=kafka -p 9092:9092 apache/kafka
```
2. Once the image has been pulled, you’ll have a Kafka instance up and running within a second or two.
3. The `apache/kafka` image ships with several helpful scripts in the `/opt/kafka/bin` directory. Run the following command to verify that the cluster is up and running and to get its cluster ID:
```console
$ docker exec -ti kafka /opt/kafka/bin/kafka-cluster.sh cluster-id --bootstrap-server :9092
```
Doing so will produce output similar to the following:
```plaintext
Cluster ID: 5L6g3nShT-eMCtK--X86sw
```
4. Create a sample topic and produce (or publish) a few messages by running the following command:
```console
$ docker exec -ti kafka /opt/kafka/bin/kafka-console-producer.sh --bootstrap-server :9092 --topic demo
```
The command waits for input. Enter a few messages, one per line. For example:
```plaintext
First message
```
And:
```plaintext
Second message
```
Press `enter` to send the last message, then press `ctrl+c` when you’re done. The messages will be published to Kafka.
5. Confirm the messages were published to the cluster by consuming them:
```console
$ docker exec -ti kafka /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server :9092 --topic demo --from-beginning
```
You should then see your messages in the output:
```plaintext
First message
Second message
```
If you’d like, open another terminal, publish a few more messages, and watch them appear in the consumer.
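If you prefer Docker Compose, the single `docker run` command above can be expressed as a minimal Compose file. This is a sketch that relies on the image's default configuration; the port mapping matches the one used above:

```yaml
# compose.yaml — minimal single-node Kafka cluster using the image defaults
services:
  kafka:
    image: apache/kafka
    container_name: kafka
    ports:
      - "9092:9092"
```

Start it with `docker compose up -d`, and the same `docker exec` commands from the steps above work unchanged.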
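As a preview of connecting a non-containerized app, the console producer and consumer steps above can be sketched in Node.js. This sketch assumes the [KafkaJS](https://kafka.js.org/) client (install it with `yarn add kafkajs`); the topic name `demo` and the broker address match the steps above:

```javascript
const { Kafka } = require('kafkajs');

// Broker address matches the port published by `docker run -p 9092:9092`.
const kafka = new Kafka({ clientId: 'demo-app', brokers: ['localhost:9092'] });

async function main() {
  // Publish a message to the demo topic, like the console producer did.
  const producer = kafka.producer();
  await producer.connect();
  await producer.send({
    topic: 'demo',
    messages: [{ value: 'Third message' }],
  });
  await producer.disconnect();

  // Read the topic from the beginning, like the console consumer did.
  const consumer = kafka.consumer({ groupId: 'demo-group' });
  await consumer.connect();
  await consumer.subscribe({ topics: ['demo'], fromBeginning: true });
  await consumer.run({
    eachMessage: async ({ message }) => {
      console.log(message.value.toString());
    },
  });
}

main().catch(console.error);
```

Running this against the cluster started above prints the messages you published earlier, followed by `Third message`. Press `ctrl+c` to stop the consumer.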