
Confluent Kafka docker-compose | How to?

Sep 17, 2022

Let’s take a closer look at running Confluent Kafka with docker-compose. As a part of our Docker Hosting Support, Bobcares offers solutions to all your queries.

 

Quick Start for Apache Kafka docker-compose using Confluent Platform

 

This article shows you how to get up and running with Confluent Platform and its main components using Docker containers, and demonstrates the basics and core capabilities of the platform.

 

The steps below walk you through a Docker-based version of the Confluent Platform quick start, which you can run entirely on a local machine.


Step 1: Start Confluent Platform Using Docker

 

First, clone the confluentinc/cp-all-in-one GitHub repository and check out the 5.5.1-post branch:

 

git clone https://github.com/confluentinc/cp-all-in-one
cd cp-all-in-one
git checkout 5.5.1-post

 

Next, navigate to the nested cp-all-in-one directory inside the repository:

 
cd cp-all-in-one/
 

Now start Confluent Platform, specifying the -d option to run in detached mode. Make sure that a minimum of 8 GB of memory is allocated to Docker.
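Before bringing the stack up, you can confirm how much memory Docker has been given. This is a quick sketch; the exact label in the docker info output may vary across Docker versions:

# Print the memory Docker has available
docker info | grep -i "total memory"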

 
docker-compose up -d 
 

This starts a separate container for each Confluent Platform component. Your output should resemble the following:

 
Creating network "cp-all-in-one_default" with the default driver
Creating zookeeper ... done
Creating broker ... done 
Creating schema-registry ... done 
Creating rest-proxy ... done
 Creating connect ... done 
Creating ksql-datagen ... done 
Creating ksqldb-server ... done 
Creating control-center ... done 
Creating ksqldb-cli ... done 
 

To verify that the services are up and running, run this command:

 
docker-compose ps 
 

If the state of any service is not Up, rerun the command below:

docker-compose up -d
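If you would rather wait in a loop until every container reports Up, here is a minimal shell sketch (it assumes the two header lines of docker-compose v1's ps output):

# Poll until no container row reports a state other than "Up"
while docker-compose ps | tail -n +3 | grep -qv "Up"; do
  echo "Waiting for services to come up..."
  sleep 5
done
echo "All services are up."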

 

Step 2: Create Kafka Topics

 

In this step, create Kafka topics using Confluent Control Center. Control Center provides a web interface for building and monitoring production data pipelines and event streaming applications.

 
  • Navigate to the Control Center web interface at http://localhost:9021/ and choose your cluster.
 
  • Select Topics from the cluster submenu >> click Add a topic.
 
  • Create a topic named pageviews and click Create with defaults.
 
  • Repeat the previous steps to create a topic named users >> click Create with defaults. (A command-line alternative is sketched after this list.)
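If you prefer the command line, the same topics can be created with the kafka-topics tool that ships inside the broker container. This is a minimal sketch: the single-partition, single-replica settings are illustrative and not necessarily the defaults Control Center applies.

# Create the pageviews and users topics from inside the broker container
docker-compose exec broker kafka-topics --create --bootstrap-server localhost:9092 \
  --partitions 1 --replication-factor 1 --topic pageviews
docker-compose exec broker kafka-topics --create --bootstrap-server localhost:9092 \
  --partitions 1 --replication-factor 1 --topic users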
 

Step 3: Install a Kafka Connector

 

Next, use Kafka Connect to run a demo source connector called kafka-connect-datagen that generates sample data for the Kafka topics pageviews and users.

 

The Kafka Connect Datagen connector was installed automatically when you started Docker Compose in Step 1.

 

Run an instance of the Kafka Connect Datagen connector to produce data to the pageviews topic in Avro format.

 
  • From your cluster, click Connect >> connect-default cluster >> Add connector.
 
  • Find the DatagenConnector tile (use Filter by type >> Sources to narrow the list) and click Connect.
 
  • Name the connector datagen-pageviews. After naming the connector, new fields will appear. Scroll down and specify the following configuration values:
 
                • In the Key converter class field, type org.apache.kafka.connect.storage.StringConverter.
                • In the kafka.topic field, type pageviews.
                • In the max.interval field, type 100.
                • In the quickstart field, type pageviews.

 
  • Click Continue >> review the connector configuration >> click Launch. (An equivalent call to the Kafka Connect REST API is sketched below.)
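The Control Center steps above amount to posting a connector configuration to the Kafka Connect REST API, which cp-all-in-one exposes on port 8083. Here is a sketch of the equivalent request; the connector class name is the one shipped with kafka-connect-datagen:

# Create the datagen-pageviews connector via the Connect REST API
curl -s -X POST -H "Content-Type: application/json" http://localhost:8083/connectors \
  -d '{
        "name": "datagen-pageviews",
        "config": {
          "connector.class": "io.confluent.kafka.connect.datagen.DatagenConnector",
          "key.converter": "org.apache.kafka.connect.storage.StringConverter",
          "kafka.topic": "pageviews",
          "max.interval": "100",
          "quickstart": "pageviews"
        }
      }'

To check that records are actually flowing, you can consume a few Avro records from inside the schema-registry container (again a sketch; the broker:29092 listener and the 8081 Schema Registry port are the ones used by this compose file):

# Read five Avro-encoded records from the pageviews topic
docker-compose exec schema-registry kafka-avro-console-consumer \
  --bootstrap-server broker:29092 \
  --property schema.registry.url=http://localhost:8081 \
  --topic pageviews --from-beginning --max-messages 5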
 

[Looking for a solution to another query? We are just a click away.]

 

Conclusion

 

To sum up, we brought up Confluent Platform with docker-compose, created Kafka topics, and installed the Datagen connector; you can now use Control Center to monitor and analyze your event streaming queries.


2 Comments

  1. Gopinath Subbaina

How much hard disk space is required to perform this exercise?

    • Hiba Razak

      Hi Gopinath,
The amount of hard disk space required to perform a Confluent Kafka Docker Compose deployment depends on various factors, such as the number of Kafka brokers, nodes, and other services included in the deployment.
Please contact our support team through live chat if you need any assistance.


