Kafkactl

A command line tool to interact with an Apache Kafka cluster.


kafkactl is a command line tool for people who work with Apache Kafka. It can be used to query information (e.g. brokers, topics, messages, consumers) or to create, update, and delete resources in the Kafka cluster. It can also consume and produce messages and has native support for messages encoded with Protocol Buffers or Apache Avro. Finally, kafkactl implements more advanced behaviour on top of these primitives, e.g. replaying messages on the same or another cluster.

Installation

You can either install a pre-compiled binary or compile from source.

Pre-compiled binaries

Download the pre-compiled binary for your platform from the releases page and copy it into a directory in your $PATH.
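
For example, after downloading an archive for your platform from the releases page, you could unpack it and install the binary roughly like this (the archive name is an assumption for illustration and differs per release and platform):

tar -xzf kafkactl_<version>_linux_amd64.tar.gz
sudo install -m 0755 kafkactl /usr/local/bin/kafkactl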

Compiling from source

If you have Go installed, you can fetch the latest code and compile an executable binary using the following command:

go install github.com/fgrosse/kafkactl@latest
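
Note that go install places the resulting binary in $(go env GOPATH)/bin (or in $GOBIN, if that is set), so make sure this directory is part of your $PATH:

export PATH="$PATH:$(go env GOPATH)/bin"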

Usage

kafkactl is intended to be used as a CLI tool on your local machine. It is useful both in day-to-day operations on the shell and in automation and scripting. The tool is split into multiple commands, each with its own help output. You can see each command's usage by passing the --help flag.

$ kafkactl --help       
kafkactl is a command line tool to interact with an Apache Kafka cluster.

Usage:
  kafkactl [command]

Managing configuration
  config      Manage the kafkactl configuration
  context     Switch between different configuration contexts

Resource operations
  create      Create resources in the Kafka cluster
  delete      Delete resources in the Kafka cluster
  get         Display resources in the Kafka cluster
  update      Update resources in the Kafka cluster

Consuming & Producing messages
  consume     Consume messages from a Kafka topic and print them to stdout
  produce     Read messages from stdin and write them to a Kafka topic
  replay      Read messages from a Kafka topic and append them to the end of a topic

Additional Commands:
  completion  Generate the autocompletion script for the specified shell

Flags:
      --config string    path to kafkactl config file (default "/home/fgrosse/.config/kafkactl/config.yml")
      --context string   the name of the kafkactl context to use (defaults to "current_context" field from config file)
  -h, --help             help for kafkactl
  -v, --verbose          enable verbose output

Use "kafkactl [command] --help" for more information about a command.

Getting started

The first thing you need to do after installing kafkactl is to set up a configuration context. Each context contains all the information needed to connect to a set of Kafka brokers that form a Kafka cluster.

kafkactl config add "my-context" --broker example.com:9092

This will add and activate a new configuration context named "my-context". Each subsequent command that interacts with Kafka will be directed towards the brokers of the currently active configuration context. This is similar to how kubectl manages different contexts to talk to multiple Kubernetes clusters.

You can also add, delete and rename configuration contexts as well as print the full kafkactl configuration using kafkactl config and its sub-commands. If you want to learn more, try passing the --help flag.
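
For example, a session that works with multiple contexts could look roughly like this (the exact sub-command names and arguments below are assumptions; refer to the --help output of kafkactl config and kafkactl context for the real interface):

kafkactl config add "staging" --broker staging.example.com:9092
kafkactl context staging
kafkactl config print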

Query information

Now you can run your first kafkactl command to query information from the cluster:

$ kafkactl get brokers                     
ID      ADDRESS      ROLE
1       kafka1:9092  controller  
2       kafka2:9093              
3       kafka3:9094

$ kafkactl get topics
NAME    PARTITIONS  REPLICATION  RETENTION
test-1  2           1            2 weeks   
test-2  4           3            7 days    
test-3  10          3            12 hours 

Apart from brokers and topics, you can also use kafkactl get to query the Kafka cluster configuration, list all known consumer groups, and fetch individual messages.
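
For example, queries along the following lines are possible (the resource names and flags below are assumptions for illustration; kafkactl get --help shows which resources are actually supported):

kafkactl get config
kafkactl get consumer-groups
kafkactl get message --topic test-1 --offset 42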

Output encoding

All sub-commands of kafkactl get default to printing information in a human-friendly way (e.g. as a table). They also have an --output flag which allows other output encodings that are more suitable for scripting (e.g. JSON) and often contain more information than fits into the tabular format.
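
For example, assuming json is one of the accepted encodings (check the --help output of the respective sub-command for the full list), you could fetch the topic list as JSON and post-process it with other tools:

kafkactl get topics --output=json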

Writing to Kafka

You can use kafkactl to create and delete topics and update configuration. Please refer to the corresponding kafkactl --help output for more information and examples.
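
For example, creating and later deleting a topic could look roughly like this (the flag names are assumptions for illustration; see the --help output of kafkactl create and kafkactl delete for the actual interface):

kafkactl create topic my-topic --partitions 6 --replication-factor 3
kafkactl delete topic my-topic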

Other kafkactls

There are multiple applications that call themselves kafkactl. All of them have been developed independently but with similar feature sets.

This incarnation of kafkactl was created at Fraugster in September 2017. It was a useful tool for many years and we decided to keep it around even after Fraugster ceased to exist, mainly because we are very used to it and maybe for sentimental reasons.

Other kafkactl implementations come with similar features (e.g. Protobuf & Avro support, managing configuration with kubectl-like contexts). We list them here so you can pick the tool that serves your use case best:

Built With

  • sarama - a Go library for Apache Kafka
  • confluent-kafka-go - Confluent's Apache Kafka Golang client
  • goavro - a library that encodes and decodes Avro data
  • cobra - a library to build powerful CLI applications
  • viper - configuration with fangs
  • protoreflect - reflection for Go Protocol Buffers
  • testify - a simple unit test library
  • and more

Contributing

Please read CONTRIBUTING.md for details on our code of conduct and on the process for submitting pull requests to this repository.

Versioning

We use SemVer for versioning. All significant (e.g. breaking) changes are documented in CHANGELOG.md. A list of all available versions can be found on the releases page.

Authors

See also the list of contributors who participated in this project.

License

This project is licensed under the BSD-3-Clause License - see the LICENSE file for details.