Bobcares

Confluent Kafka docker-compose | How to?

by | Sep 17, 2022

Let’s take a closer look at Confluent Kafka docker-compose. As part of our Docker hosting support, Bobcares offers solutions to all your queries.

 

Quick Start for Apache Kafka docker-compose using Confluent Platform

 

This article shows you how to get up and running with Confluent Platform and its main components using Docker containers, and demonstrates the basics and capabilities of Confluent Platform.

 

You can follow the steps below, which mirror the automated quick start designed for local Confluent Platform installs.


Step 1: Start Confluent Platform Using Docker

 

First, clone the confluentinc/cp-all-in-one GitHub repository and check out the 5.5.1-post branch.

 

git clone https://github.com/confluentinc/cp-all-in-one
cd cp-all-in-one
git checkout 5.5.1-post

 

Navigate to the cp-all-in-one/cp-all-in-one directory.

 
cd cp-all-in-one/
 

Now start Confluent Platform, specifying the -d option to run in detached mode. Make sure that a minimum of 8 GB of memory is allocated to Docker.

 
docker-compose up -d 
 

This starts a separate container for each Confluent Platform component. Your output should resemble the following:

 
Creating network "cp-all-in-one_default" with the default driver
Creating zookeeper ... done
Creating broker ... done
Creating schema-registry ... done
Creating rest-proxy ... done
Creating connect ... done
Creating ksql-datagen ... done
Creating ksqldb-server ... done
Creating control-center ... done
Creating ksqldb-cli ... done
 

To verify that the services are up and running, run this command:

 
docker-compose ps 
 

If the state of any service is not Up, rerun the command:

docker-compose up -d
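If you want to script this check, the state test can be sketched as a small helper that parses `docker-compose ps` output. This is an illustrative sketch, not part of the official quick start; the header-line count and the "Up" column assume the docker-compose v1 output format shown above.

```shell
# Illustrative helper (an assumption, not part of the quick start): reads
# `docker-compose ps` output on stdin and succeeds only when every container
# line shows the "Up" state.
all_up() {
  # Skip the two header lines, then fail if any remaining line lacks " Up ".
  ! tail -n +3 | grep -v ' Up ' | grep -q .
}

# Usage, with the stack from this step running:
# while ! docker-compose ps | all_up; do docker-compose up -d; sleep 10; done
```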

 

Step 2: Create Kafka Topics

 

In this step, create Kafka topics using Confluent Control Center. Control Center provides functionality for building and monitoring production data pipelines and event streaming applications.

 
  • Navigate to the Control Center web interface at http://localhost:9021/ and choose your cluster.
 
  • Next, select “Topics” from the cluster submenu >> click Add a topic.
 
  • Create a topic named pageviews and click Create with defaults.
 
  • Repeat the previous steps to create a topic named users >> click Create with defaults.
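As an alternative to the Control Center UI, the same topics can be created from the command line with the kafka-topics tool that ships inside the broker container. A hedged sketch, assuming the broker service name and the localhost:9092 listener from the cp-all-in-one compose file:

```shell
# Sketch (assumes the broker container and localhost:9092 listener from the
# cp-all-in-one compose file): create both quick-start topics via the CLI.
for topic in pageviews users; do
  docker-compose exec broker kafka-topics --create \
    --topic "$topic" \
    --bootstrap-server localhost:9092 \
    --partitions 1 \
    --replication-factor 1
done
```

This requires the Docker stack from Step 1 to be running, so it is shown for reference rather than as a tested command.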
 

Step 3: Install Kafka Connector

 

Further, you can use Kafka Connect to run a demo source connector called “kafka-connect-datagen” that creates simple sample data for the Kafka topics pageviews and users.

 

The Kafka Connect Datagen connector was installed automatically when you started Docker Compose in Step 1: Start Confluent Platform Using Docker.

 

Run an instance of the Kafka Connect Datagen connector to produce Kafka data to the pageviews topic in Avro format.

 
  • From your cluster, click Connect >> the connect-default cluster >> Add connector.
 
  • Find the DatagenConnector tile (use Filter by type >> Sources to narrow the list) and click Connect.
 
  • Name the connector datagen-pageviews. After naming the connector, new fields appear. Scroll down and specify the following configuration values:
 

                . In the Key converter class field, type org.apache.kafka.connect.storage.StringConverter.
                . In the kafka.topic field, type pageviews.
                . In the max.interval field, type 100.
                . In the quickstart field, type pageviews.

 
  • Click Continue, review the connector configuration, and click Launch.
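The UI steps above can also be expressed as a single request to the Kafka Connect REST API. The payload below is an illustrative sketch: the connector class name is the standard one for kafka-connect-datagen, and the localhost:8083 Connect listener is an assumption based on the cp-all-in-one compose file.

```shell
# Illustrative connector config mirroring the UI fields above. The class name
# and the Connect REST port (8083) are assumptions from the cp-all-in-one setup.
payload='{
  "name": "datagen-pageviews",
  "config": {
    "connector.class": "io.confluent.kafka.connect.datagen.DatagenConnector",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "kafka.topic": "pageviews",
    "max.interval": "100",
    "quickstart": "pageviews"
  }
}'

# Usage (with the stack from Step 1 running):
# curl -s -X POST -H "Content-Type: application/json" \
#      --data "$payload" http://localhost:8083/connectors
```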
 


 

Conclusion

 

To sum up, you can use Confluent Control Center to monitor and analyze event streaming queries, and run an automated version of this quick start designed for local Confluent Platform installs.

