Categories
Automation DevOps

Kafka Producer and Consumer in Python

Today, I’ll demo a Kafka producer and a consumer written in Python. We’ll see a fully working demo of the producer and consumer running against Kafka in a docker-compose stack.

If you later find this article useful, take a look at the disclaimer for information on how to thank me.

Introduction

I have already talked about what Kafka is and its basic producer and consumer architecture, so I won’t delve into the details again and will dive straight into the demo.

Kafka Producer and Consumer in Python Demo

Let’s now see a demo of a Kafka producer and consumer in a docker-compose stack. The stack will include Kafka and Zookeeper.

Demo Prerequisites

Install the following on your local computer:

  • docker
  • docker-compose
  • git

If you prefer cloud solutions, I’d suggest using a managed Docker service on Linode.

Linode is a cloud service provider recently acquired by Akamai. With this acquisition, Akamai became a competitor in the cloud provider market. You can repeat this demo on your own Linode account; create one and get $100 of credit using this link.

You can create a VM with docker and docker-compose preinstalled using the linode-cli command (replace the root password below with a strong one):

linode-cli linodes create \
  --image 'linode/ubuntu20.04' \
  --region eu-central \
  --type g6-standard-2 \
  --label docker-eu-central \
  --root_pass test \
  --booted true \
  --backups_enabled false \
  --private_ip false \
  --stackscript_id 607433 \
  --stackscript_data '{"disable_root": "No","mx":"No","spf":"No"}'

StackScript 607433 is Linode’s marketplace script for creating a Docker VM with one click:

linode-cli stackscripts view 607433
┌────────┬──────────┬──────────────────┬──────────────────────────────────────────────────────┬───────────┬─────────────────────┬─────────────────────┐
│ id     │ username │ label            │ images                                               │ is_public │ created             │ updated             │
├────────┼──────────┼──────────────────┼──────────────────────────────────────────────────────┼───────────┼─────────────────────┼─────────────────────┤
│ 607433 │ linode   │ Docker One-Click │ linode/debian10, linode/debian11, linode/ubuntu20.04 │ True      │ 2019-10-31T20:14:04 │ 2023-03-15T18:13:02 │
└────────┴──────────┴──────────────────┴──────────────────────────────────────────────────────┴───────────┴─────────────────────┴─────────────────────┘

If you prefer, you can do the same via Linode’s Cloud Manager UI.

Kafka Docker-Compose Stack

You can view the Kafka stack, described in docker-compose.yaml, on my GitHub:

It has four services: Zookeeper, Kafka, the Kafka producer, and the Kafka consumer.
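The exact file is in the repository; as a rough sketch (image tags, build paths, and ports here are illustrative guesses — the repo’s docker-compose.yaml is authoritative), the four services wire together like this:

```yaml
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.3.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
  kafka:
    image: confluentinc/cp-kafka:7.3.0
    depends_on:
      - zookeeper
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
  producer:
    build: ./producer        # FastAPI producer app, built locally
    ports:
      - "8000:8000"
    depends_on:
      - kafka
  consumer:
    build: ./consumer        # console consumer, built locally
    depends_on:
      - kafka
```

The depends_on entries give the startup order you’ll see in a moment: Zookeeper, then Kafka, then the Python apps.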

Let’s bring up the stack:

docker-compose up --build producer

The command pulls the Zookeeper and Kafka images and builds the producer’s image locally.

Wait until all services are running and healthy. Note that Zookeeper starts first, then Kafka, and finally the Kafka producer.

Kafka Python Producer

The Kafka producer is a simple FastAPI web app:

import socket

from fastapi import FastAPI
from confluent_kafka import Producer

app = FastAPI()

# Connect to the broker advertised inside the docker-compose network
conf = {'bootstrap.servers': "kafka:9092",
        'client.id': socket.gethostname()}
producer = Producer(conf)

@app.post("/produce")
async def produce(key: str):
    # The query parameter becomes the message value; the message key is a fixed string
    producer.produce('my-topic', key="key", value=key)
    # producer.flush()  # left commented out, so the endpoint returns without waiting for delivery
    return {"status": "success"}
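Because flush() is commented out, the endpoint returns immediately while messages are buffered and sent in the background. If you want per-message delivery confirmation without blocking each request, Confluent’s client accepts a delivery callback; here is a minimal sketch (the callback and the message.timeout.ms setting are my additions — the topic and broker address come from the demo):

```python
from confluent_kafka import Producer

# Fail a message if it can't be delivered within 5 seconds (assumed value)
conf = {'bootstrap.servers': 'kafka:9092',
        'message.timeout.ms': 5000}
producer = Producer(conf)

def delivery_report(err, msg):
    # Invoked from producer.poll() or producer.flush(), once per message
    if err is not None:
        print(f'delivery failed: {err}')
    else:
        print(f'delivered to {msg.topic()} [{msg.partition()}] at offset {msg.offset()}')

producer.produce('my-topic', key='key', value='test', callback=delivery_report)
producer.poll(0)                # serve any pending delivery callbacks without blocking
remaining = producer.flush(10)  # on shutdown, wait up to 10s for outstanding messages
```

In the FastAPI app you would call producer.poll(0) inside the request handler instead of flushing, keeping the endpoint non-blocking while still surfacing delivery errors.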

To send a message to the producer, use the curl command below:

curl -X POST "http://localhost:8000/produce?key=test"

It will publish the message to the Kafka topic my-topic.

Kafka Python Consumer

Let’s now start the Kafka consumer in a different console session:

docker-compose up --build consumer

The Kafka consumer subscribes to the topic my-topic, consumes its messages, and prints them to the console.
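The consumer’s code isn’t shown in the article, but judging by the log output below and Confluent’s getting-started examples, it looks roughly like this (the group id and poll timeout are my assumptions):

```python
from confluent_kafka import Consumer

conf = {'bootstrap.servers': 'kafka:9092',
        'group.id': 'my-group',            # assumed consumer group name
        'auto.offset.reset': 'earliest'}   # start from the beginning on first run

consumer = Consumer(conf)
consumer.subscribe(['my-topic'])
print('subscribed')

try:
    while True:
        msg = consumer.poll(timeout=5.0)   # block up to 5s waiting for a message
        print('polled')
        if msg is None:
            continue                       # nothing arrived within the timeout
        if msg.error():
            print(f'consumer error: {msg.error()}')
            continue
        print(msg.value())                 # raw bytes, e.g. b'test'
finally:
    consumer.close()                       # commit offsets and leave the group cleanly
```

This explains the log lines: each poll() prints “polled” whether or not a message arrived, and message values are printed as raw bytes.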

Send a few more sample messages to the producer and watch the consumer print them after consumption:

consumer_1   | subscribed
consumer_1   | polled
consumer_1   | b'test'
consumer_1   | polled
consumer_1   | polled
consumer_1   | polled
consumer_1   | polled
consumer_1   | polled
consumer_1   | polled
consumer_1   | polled
consumer_1   | polled
consumer_1   | b'test'

Note that both the producer’s and the consumer’s code use Confluent’s Kafka Python client and are based on examples from Confluent’s website.

Summary

That’s it for the Python Kafka producer and consumer. As always, feel free to share.

If you found this article useful, take a look at the disclaimer for information on how to thank me.

You may also find the articles below interesting:

Recommended Kafka courses on Pluralsight:

Sign up using this link to get exclusive discounts, like 50% off your first month or 15% off an annual subscription.

Recommended Kafka books on Amazon.