Kafka Command Line Interface (CLI): Usage & Best Practices
Explore the comprehensive guide to mastering the Kafka Command Line Interface (CLI). This guide dives into the core functionalities, common use cases, configuration options, best practices, and troubleshooting approaches for using Kafka CLI tools. Learn how to efficiently manage Kafka resources, produce and consume messages, handle consumer groups, and monitor cluster health using the powerful set of CLI commands available in Apache Kafka.
AutoMQ Team
March 8, 2025

Overview

The Kafka Command Line Interface (CLI) is a collection of shell scripts bundled with Apache Kafka that gives developers and administrators a powerful set of tools for managing Kafka resources from the terminal. As one of the fastest and most direct ways to interact with a Kafka cluster, the CLI offers essential functionality for creating and configuring topics, producing and consuming messages, managing consumer groups, and monitoring cluster health. This guide explores the Kafka CLI's capabilities, common use cases, configuration options, best practices, and troubleshooting approaches to help you use this versatile toolset effectively.

Understanding Kafka CLI Tools

Kafka CLI tools consist of various shell scripts located in the /bin directory of the Kafka distribution. These scripts provide a wide range of functionality for interacting with Kafka clusters, managing topics, producing and consuming messages, and handling administrative tasks. The CLI is particularly valuable for quick testing, troubleshooting, and automation without requiring code development.

Essential Kafka CLI Commands

The most commonly used Kafka CLI commands fall into a few functional categories: topic management, message production and consumption, and consumer group administration.

Let's examine each of these categories in more detail with their specific usage patterns.

Topic Management Commands

Topic management is one of the most common uses of the Kafka CLI. Here are detailed commands for managing Kafka topics:


## Create a topic with 3 partitions and replication factor of 1
bin/kafka-topics.sh --bootstrap-server localhost:9092 --create --topic my-topic --partitions 3 --replication-factor 1

## List all topics in the cluster
bin/kafka-topics.sh --bootstrap-server localhost:9092 --list

## Describe a specific topic
bin/kafka-topics.sh --bootstrap-server localhost:9092 --describe --topic my-topic

## Add partitions to an existing topic
bin/kafka-topics.sh --bootstrap-server localhost:9092 --alter --topic my-topic --partitions 6

## Delete a topic (if delete.topic.enable=true)
bin/kafka-topics.sh --bootstrap-server localhost:9092 --delete --topic my-topic


These commands allow administrators to create, monitor, modify, and remove topics as needed[1][2].
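
Topic-level settings can also be applied at creation time with repeated --config flags, or changed later with kafka-configs.sh. A small sketch (the topic name and the values shown are illustrative):

```shell
# Create a compacted topic with a 7-day retention override
bin/kafka-topics.sh --bootstrap-server localhost:9092 --create \
  --topic user-profiles \
  --partitions 3 --replication-factor 1 \
  --config cleanup.policy=compact \
  --config retention.ms=604800000

# Change a topic-level config after creation
bin/kafka-configs.sh --bootstrap-server localhost:9092 --alter \
  --entity-type topics --entity-name user-profiles \
  --add-config retention.ms=86400000
```

Setting configs at creation time avoids a window in which the topic exists with cluster defaults that may not suit the workload.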

Producer and Consumer Commands

The CLI provides tools for producing messages to topics and consuming messages from topics:


## Start a console producer to send messages to a topic
bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic my-topic

## Start a console consumer to read messages from a topic
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-topic

## Consume messages from the beginning of a topic
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-topic --from-beginning

## Consume messages as part of a consumer group
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic my-topic --group my-group


These commands enable interactive testing of message production and consumption, which is valuable for debugging and verification[3].
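
The console consumer can also display record metadata, which helps when verifying keys and ordering during debugging. A sketch assuming the default string deserializers:

```shell
# Print keys and timestamps alongside values while consuming
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic my-topic --from-beginning \
  --property print.key=true \
  --property print.timestamp=true \
  --property key.separator=" | "
```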

Consumer Group Management

Consumer groups can be managed and monitored using these commands:


## List all consumer groups
bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 --list

## Describe a consumer group (shows partitions, offsets, lag)
bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 --describe --group my-group

## Reset offsets for a consumer group
bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 --reset-offsets --group my-group --topic my-topic --to-earliest --execute

## Delete a consumer group
bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 --delete --group my-group


These commands help in monitoring consumer progress, diagnosing performance issues, and managing consumer offsets[2].
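
Because offset resets are destructive, it is worth previewing them first: the tool's --dry-run flag prints the target offsets without applying anything (shown with the same illustrative group and topic; note that a reset only succeeds while the group has no active members):

```shell
# Preview the reset without changing any offsets
bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 \
  --reset-offsets --group my-group --topic my-topic \
  --to-earliest --dry-run

# Apply it only once the preview looks right
bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 \
  --reset-offsets --group my-group --topic my-topic \
  --to-earliest --execute
```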

Common Use Cases for Kafka CLI

The Kafka CLI serves several important use cases that make it an essential tool for Kafka administrators and developers.

Testing and Verification

The CLI is ideal for quickly testing Kafka cluster functionality. For example, you can verify that messages can be successfully produced and consumed:


## Terminal 1: Start a consumer
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test-topic

## Terminal 2: Produce test messages
bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic test-topic


Data Backfilling

When you need to import historical data into Kafka, the console producer can read data from files:


## Import data from a file to a Kafka topic
cat data.json | bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic my-topic


This approach is useful for one-time data imports or testing with sample datasets[3].
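
If the file contains keyed records, the console producer can split each line into a key and a value. A minimal sketch, assuming one `key:value` pair per line (the file name and separator here are illustrative):

```shell
# Prepare a sample file with one key:value record per line
printf 'user1:{"action":"login"}\nuser2:{"action":"logout"}\n' > keyed-data.txt

# parse.key tells the producer to treat the text before the
# separator as the record key, preserving per-key ordering
bin/kafka-console-producer.sh --bootstrap-server localhost:9092 \
  --topic my-topic \
  --property parse.key=true \
  --property key.separator=: < keyed-data.txt
```

Keyed imports matter when downstream consumers rely on partition-level ordering per key.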

Shell Scripting and Automation

The Kafka CLI can be incorporated into shell scripts to automate operations, such as monitoring logs or performing scheduled administrative tasks. For example:


#!/bin/bash
LOGFILE=security_events.log
checksum=$(md5sum "$LOGFILE" | awk '{ print $1 }')
while true
do
  sleep 60
  new_checksum=$(md5sum "$LOGFILE" | awk '{ print $1 }')
  if [ "$new_checksum" != "$checksum" ]; then
    # Produce the updated log to the security log topic
    bin/kafka-console-producer.sh --topic full-security-log --bootstrap-server localhost:9092 < "$LOGFILE"
    checksum=$new_checksum
  fi
done


This makes it easy to incorporate Kafka operations into broader automation workflows[3].

Configuration and Setup

Installation and Basic Setup

To use Kafka CLI tools, you need to have Apache Kafka installed:

  1. Download Kafka from the Apache Kafka website

  2. Extract the downloaded file: tar -xzf kafka_2.13-3.1.0.tgz

  3. Navigate to the Kafka directory: cd kafka_2.13-3.1.0

  4. Set up environment variables (optional but recommended):


export KAFKA_HOME=/path/to/kafka
export PATH=$PATH:$KAFKA_HOME/bin


Starting the Kafka Environment

For a basic development environment, you need to start ZooKeeper (if using ZooKeeper mode) and then Kafka:


## Start ZooKeeper (if using ZooKeeper mode)
bin/zookeeper-server-start.sh config/zookeeper.properties

## Start Kafka
bin/kafka-server-start.sh config/server.properties


Secure Connections

For secure Kafka clusters, additional configuration is needed. Common authentication methods include:

SASL Authentication


bin/kafka-topics.sh --bootstrap-server kafka:9092 --command-config client.properties --list

Where client.properties contains:


security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="user" password="password";

SSL Configuration


bin/kafka-console-producer.sh --bootstrap-server kafka:9093 --producer.config client-ssl.properties --topic my-topic


These security configurations ensure that CLI tools can connect to secured Kafka clusters[4].

Best Practices for Kafka CLI

General Best Practices

  1. Use scripts for repetitive tasks : Create shell scripts for common operations to ensure consistency.

  2. Set default configurations : Use configuration files with the --command-config parameter to avoid typing the same options repeatedly.

  3. Test in development first : Always test commands in a development environment before executing in production.

  4. Document commands : Maintain documentation of frequently used commands and their parameters.
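
As a concrete sketch of practice 2, connection settings can live in a shared properties file that every admin tool reuses (the file name and values here are illustrative):

```shell
# Store shared client settings once
cat > client.properties <<'EOF'
bootstrap.servers=localhost:9092
request.timeout.ms=30000
EOF

# Reuse the same file across admin tools
bin/kafka-topics.sh --bootstrap-server localhost:9092 \
  --command-config client.properties --list
```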

Production Environment Considerations

  1. Limit direct access : Restrict access to production Kafka CLI tools to authorized administrators only.

  2. Use read-only operations : Prefer read-only operations (like --describe and --list ) when possible.

  3. Double-check destructive commands : Carefully verify commands that modify or delete data before executing them.

  4. Handle encoded messages carefully : When working with encoded messages, ensure consumers use the same schema as producers[3].

Performance Optimization

  1. Batch operations : When possible, batch related operations to minimize connections to the Kafka cluster.

  2. Be careful with --from-beginning : Avoid using this flag on large topics, as it replays the entire topic and can be slow and resource-intensive.

  3. Use specific partitions : When debugging, specify partitions directly to limit the amount of data processed.

  4. Monitor resource usage : Keep an eye on CPU and memory usage when running resource-intensive CLI commands.
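
For practice 3, the console consumer can be pointed at a single partition and capped at a fixed number of records, which keeps debugging sessions cheap (the partition, offset, and count shown are illustrative):

```shell
# Read 10 records from partition 0, starting at offset 100
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic my-topic \
  --partition 0 \
  --offset 100 \
  --max-messages 10
```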

Troubleshooting Common Issues

When working with Kafka CLI, you may encounter various issues. Here are some common problems and their solutions:

Broker Connectivity Issues

Problem : Unable to connect to Kafka brokers

Solutions :

  • Verify that the broker addresses in --bootstrap-server are correct

  • Check network connectivity and firewall rules

  • Ensure the Kafka brokers are running

  • Verify that security configuration matches broker settings[5]
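
A quick way to confirm reachability is to ask a broker for its supported API versions; if the first command below returns output, basic connectivity (and any configured authentication) is working (host and port are illustrative):

```shell
# Probe the broker; prints the API versions it supports on success
bin/kafka-broker-api-versions.sh --bootstrap-server localhost:9092

# Or check raw TCP reachability first
nc -vz localhost 9092
```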

Topic Management Issues

Problem : Topic creation failing

Solutions :

  • Check if the Kafka cluster has sufficient resources

  • Verify that topic configuration is valid

  • Ensure you have necessary permissions

  • Check if a topic with the same name already exists[5]

Consumer Group Issues

Problem : Consumer group not working properly

Solutions :

  • Use kafka-consumer-groups.sh to verify current status

  • Check consumer configurations

  • Verify permissions for the consumer group

  • Ensure the topic exists and has messages[6]

Conclusion

The Kafka CLI provides a powerful and efficient way to interact with Kafka clusters, offering essential functionality for developers and administrators. By understanding the available commands, following best practices, and knowing how to troubleshoot common issues, you can effectively leverage the CLI for various Kafka operations.

For simple tasks and administrative operations, the CLI remains the fastest and most direct approach. For more complex scenarios or when a graphical interface is preferred, alternative tools like Conduktor, Redpanda Console, or Confluent Control Center can complement the CLI experience.

Whether you're testing a new Kafka setup, troubleshooting issues, or automating operations, mastering the Kafka CLI is essential for anyone working with Kafka in development or production environments.

If you find this content helpful, you might also be interested in our product AutoMQ. AutoMQ is a cloud-native alternative to Kafka that decouples durability to S3 and EBS: 10x more cost-effective, no cross-AZ traffic cost, autoscaling in seconds, and single-digit millisecond latency. AutoMQ's source code is available on GitHub, and big companies worldwide are already using it. Check out our case studies to learn more.

References

  1. Klient: A native, statically-compiled command line

  2. Getting an error when trying to consume

  3. Kafka topics not showing up with the command

  4. Need help installing Confluent Platform

  5. Do you skip the service layer?

  6. Kafka UI for AWS MSK

  7. Conduktor Platform CLI Reference

  8. Kafka Tools

  9. Kafka-acls CLI error with Confluent Cloud instance

  10. Kafka CLI Commands

  11. Apache Kafka CLI Cheat Sheet

  12. Kafka Tutorial: Kafka Cheat Sheet

  13. Kafka Server Setup and CLI Tutorial

  14. Confluent CLI Command Reference

  15. Web User Interface Tools for Kafka

  16. Redpanda Console

  17. Kafka Tutorial: Kafka Console Producer

  18. CLI tool for viewing/writing to and from Kafka

  19. MonKafka: Building a Kafka broker from scratch

  20. MSK tutorial does not seem to work

  21. How do I cleanup zombie consumer groups on Kafka?

  22. Struggling to create a S3 sink connector

  23. User pwarnock comments

  24. Do you use Kafka mostly via the CLI?

  25. Kafka CLI Cheat Sheet: The Coding Interface

  26. How to list all active connections on Kafka?

  27. What are your top frustrations with Kafka?

  28. A few starter questions: What is a good setup for...

  29. Need help with receiving messages from multiple...

  30. Messages Behind in Confluent Cloud Kafka S3 Sink

  31. 12 Kafka Best Practices: Run Kafka Like the Pros

  32. Apache Kafka CLI Tutorial

  33. Common Kafka Errors and How to Resolve Them

  34. Create Kafka Environment Using Confluent Cloud CLI

  35. Best Practices for Console Cases

  36. Redpanda vs Kafka Comparison

  37. Kafka Consumer CLI Tutorial

  38. AWS CLI Command Reference - Kafka

  39. Kafka CLI Tutorial

  40. Troubleshooting Kafka: Common Issues and Their Resolutions

  41. Install and Use Confluent CLI

  42. Confluent Cloud vs Kafka Open Source

  43. Great Resources to Learn and Master Kafka

  44. Anyone Managed to Get Knative Working with Kafka?

  45. Completely Confused About KRaft Mode Setup

  46. Needed Help for Troubleshooting AWS MSK

  47. What Big Problems with Apache Kafka Do You Have?

  48. Which Kafka Client Library Should I Use?

  49. Does Anyone Know Good Tutorials for Kafka?

  50. Great Resources to Learn and Master Kafka

  51. Kafka Introduction with CLI Commands

  52. Conduktor CTL

  53. QuestDB: Third Party Tools - Redpanda

  54. Kafkactl

  55. Getting Started with Apache Kafka

  56. Step by Step Guide to Redpanda Console for Kafka

  57. Common Kafka Commands and Core Concepts

  58. Kafka Tutorial: Consumer Configuration

  59. Kafka ETL Tool: Is There Any?

  60. Terminal UI for Kafka: KafUI

  61. State Store Data: Confluent Kafka Table

  62. How Are You All Handling Config Files?

  63. Should I Run Kafka on K8s?

  64. Properly Setting Advertised Listeners for Docker

  65. Looking for Reviews on MSK in Production

  66. What is the Right Way to Create Topics in Kafka?

  67. Is Redpanda Going to Replace Apache Kafka?

  68. Kafka Producer CLI Tutorial

  69. AWS CLI Command Reference - Kafka

  70. Kafka Post-Deployment

  71. Using Kafka with the Command Line

  72. Apache Kafka Best Practices: Security

  73. Kafka Topics CLI Tutorial

  74. Apache Kafka Quickstart

  75. Which Kafka Go Client Are You Using?

  76. How a Docker Container Connects Kafka in Local

  77. How to Start Debugging Flush

  78. Confluent Kafka Go Client

  79. Redpanda Console Configuration

  80. Do You Rely on an Enterprise-backed Kafka Version?

  81. Crates Recommendations for CLI Apps

  82. Best Practices for Kafka Clients
