Get started with Managed Service for Apache Kafka®

Apache Kafka® is a distributed event store and stream-processing platform. It's an open-source system developed by the Apache Software Foundation and written in Java and Scala. With DoubleCloud's Managed Service for Apache Kafka®, you can deploy an Apache Kafka® cluster in the cloud and get a production-ready, fully managed service in about 10 minutes.

This tutorial explains how to create a Managed Apache Kafka® cluster on DoubleCloud, create a topic, and connect to the new cluster.

If you're already familiar with Apache Kafka® and know how to configure it, refer to Create an Apache Kafka® cluster for more detailed instructions instead.

Before you begin

  1. Log in or sign up to the DoubleCloud console.


    If you're a new DoubleCloud user, this tutorial won't cost you anything: you can use the trial period credits to test the platform, including creating fully operational clusters.

  2. Make sure you have a tool for connecting to Apache Kafka®, such as kcat (formerly known as kafkacat). To install kcat, use one of the following options:

    Docker: pull the kcat image from Docker Hub. This command uses version 1.7.1, but you can use the latest one:

    docker pull edenhill/kcat:1.7.1

    Package manager: install kcat from your distribution's package repository. On Debian and Ubuntu, the package is named kafkacat:

    sudo apt install kafkacat

    For other connection options, refer to Connect to an Apache Kafka® cluster.

Step 1. Create a cluster

An Apache Kafka® cluster is one or several broker hosts that contain topics and their partitions.

To create a Managed Apache Kafka® cluster, take the following steps:

  1. Go to the Clusters page in the console and click Create cluster.

  2. Select Kafka.


    The cluster creation page contains various options that let you configure the cluster for your needs. If you're just testing Apache Kafka® and DoubleCloud, you can keep the default settings, which create a fully functional cluster with a minimal resource configuration. To do that, click Submit at the bottom of the page and skip to Step 2. Create a topic.

    Otherwise, if you want to learn how you can configure the cluster, continue with the following steps.

  3. Review the Provider and Region settings.

    You can create Managed Apache Kafka® clusters on AWS or Google Cloud in any of the available regions. By default, DoubleCloud preselects the region nearest to you.

  4. Review Resources.

    For this getting-started guide, the defaults are enough. However, when you create a production cluster, make sure to select more than one zone to ensure high availability.

  5. Under Basic settings, enter the cluster name, for example kafka-dev. Leave the latest version preselected.

Creating a cluster usually takes no more than seven minutes depending on the cloud provider and region. When the cluster is ready, its status changes from Creating to Alive.

Screenshot of a newly created Managed Apache Kafka® cluster in the DoubleCloud console


DoubleCloud creates the admin superuser and its password automatically. You can find the credentials in the Overview tab on the cluster information page.
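Once the cluster is Alive, you can check that it accepts connections by asking a broker for its metadata with kcat's -L flag. This is a sketch: replace the <connection_string> and <cluster_password> placeholders with the values from the Overview tab before running it.

```shell
# -L lists cluster metadata: brokers, topics, and partitions.
# Substitute the placeholders with the values from the Overview tab.
kcat -L \
    -b <connection_string> \
    -X security.protocol=SASL_SSL \
    -X sasl.mechanisms=SCRAM-SHA-512 \
    -X sasl.username="admin" \
    -X sasl.password="<cluster_password>"
```

If the credentials are correct, the output lists the broker host and, later in the tutorial, the topics you create.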

Step 2. Create a topic

  1. On the cluster page, go to the Topics tab and click Create.

  2. Expand Topic settings and specify the following options:

    • Cleanup policy: Select Delete. This policy deletes log segments when the retention time or log size reaches the limit.

    • Compression type: Select Uncompressed. You don't need compression for this tutorial.

    • Retention bytes: Enter 1048576 (1 MB in bytes).

    • Retention in ms: Enter 600000 (10 minutes in milliseconds).

    Screenshot of topic settings

  3. Under Basic Settings:

    • Name: Enter a topic name, such as first-topic.

    • Partitions: Enter 1. One partition is enough for a simple topic like the one you're creating.

    • Replication factor: Enter 1. This setting specifies how many copies of your data are created. One copy is enough for this tutorial.

    Screenshot of basic settings

  4. Click Submit.
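The two retention values above are round numbers in disguise. If you want to double-check them, or adapt them for your own limits, the shell can do the arithmetic:

```shell
# Retention bytes: 1 MB expressed in bytes (1024 * 1024)
echo $((1024 * 1024))

# Retention in ms: 10 minutes expressed in milliseconds (10 * 60 * 1000)
echo $((10 * 60 * 1000))
```

The first command prints 1048576 and the second prints 600000, matching the values entered in the topic settings.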

Step 3. Connect to the cluster

  1. On the cluster page, go to the Overview tab.

  2. Take note of the connection string and user credentials under Connection strings → Public. You need them to connect to the cluster.

  3. Create a consumer by running one of the following commands:

    Docker:

    docker run --name kcat --rm -i -t edenhill/kcat:1.7.1 \
          -C \
          -b <connection_string> \
          -t <topic_name> \
          -X security.protocol=SASL_SSL \
          -X sasl.mechanisms=SCRAM-SHA-512 \
          -X sasl.username="admin" \
          -X sasl.password="<cluster_password>"

    Local kcat:

    kcat -C \
          -b <connection_string> \
          -t <topic_name> \
          -X security.protocol=SASL_SSL \
          -X sasl.mechanisms=SCRAM-SHA-512 \
          -X sasl.username="admin" \
          -X sasl.password="<cluster_password>"

    Replace <connection_string> and <cluster_password> with the values from the Overview tab.

    The terminal should display the following status message:

    % Reached end of topic first-topic [0] at offset 0
  4. In a separate terminal instance, create a producer and push the data with one of the following commands. Replace <sample_data_URL> with the URL of the data you want to stream:

    Docker:

    curl <sample_data_URL> | docker run --name kcat-producer --rm -i edenhill/kcat:1.7.1 -P \
          -b <connection_string> \
          -t <topic_name> \
          -k key \
          -X security.protocol=SASL_SSL \
          -X sasl.mechanisms=SCRAM-SHA-512 \
          -X sasl.username="admin" \
          -X sasl.password="<cluster_password>"

    Local kcat:

    curl <sample_data_URL> | kcat -P \
          -b <connection_string> \
          -t <topic_name> \
          -k key \
          -X security.protocol=SASL_SSL \
          -X sasl.mechanisms=SCRAM-SHA-512 \
          -X sasl.username="admin" \
          -X sasl.password="<cluster_password>"
  5. If you've completed all the steps successfully, you'll see the uploaded data in the terminal with the consumer:

          "Hit_ID": 40668,
          "Date": "2017-09-09",
          "Time_Spent": "730.875",
          "Cookie_Enabled": 0,
          "Redion_ID": 11,
          "Gender": "Female",
          "Browser": "Chrome",
          "Traffic_Source": "Social network",
          "Technology": "PC (Windows)"
    % Reached end of topic first-topic [0] at offset 1102
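Before streaming a whole dataset, you can also push a single hand-written message and watch it arrive in the consumer terminal. A minimal sketch, using the same placeholders as the commands above:

```shell
# Produce one JSON message with key "key" to the topic.
# The consumer terminal should print it almost immediately,
# and the reported offset should advance by one.
echo '{"message": "hello from kcat"}' | kcat -P \
    -b <connection_string> \
    -t first-topic \
    -k key \
    -X security.protocol=SASL_SSL \
    -X sasl.mechanisms=SCRAM-SHA-512 \
    -X sasl.username="admin" \
    -X sasl.password="<cluster_password>"
```

This round trip through a single message is an easy way to confirm that authentication, topic name, and network access are all correct before you point a real data source at the cluster.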

What's next

Now that you have learned how to create a cluster and connect to it, continue exploring the DoubleCloud platform or create a production Managed Apache Kafka® cluster for your needs.