
fgeller / Kt

Licence: MIT
Kafka command line tool that likes JSON

Programming Languages

go
31211 projects - #10 most used programming language
golang
3204 projects

Projects that are alternatives of or similar to Kt

Kafkactl
Command Line Tool for managing Apache Kafka
Stars: ✭ 177 (-77.85%)
Mutual labels:  cli, apache-kafka, kafka
Debezium
Change data capture for a variety of databases. Please log issues at https://issues.redhat.com/browse/DBZ.
Stars: ✭ 5,937 (+643.05%)
Mutual labels:  apache-kafka, kafka
Zerocode
A community-developed, free, open source, microservices API automation and load testing framework built using JUnit core runners for Http REST, SOAP, Security, Database, Kafka and much more. Zerocode Open Source enables you to create, change, orchestrate and maintain your automated test cases declaratively with absolute ease.
Stars: ✭ 482 (-39.67%)
Mutual labels:  json, kafka
Librdkafka
The Apache Kafka C/C++ library
Stars: ✭ 5,617 (+603%)
Mutual labels:  apache-kafka, kafka
Agile data code 2
Code for Agile Data Science 2.0, O'Reilly 2017, Second Edition
Stars: ✭ 413 (-48.31%)
Mutual labels:  apache-kafka, kafka
Remarshal
Convert between CBOR, JSON, MessagePack, TOML, and YAML
Stars: ✭ 421 (-47.31%)
Mutual labels:  cli, json
Kq
Kafka-based Job Queue for Python
Stars: ✭ 530 (-33.67%)
Mutual labels:  apache-kafka, kafka
Awesome Kafka
A list about Apache Kafka
Stars: ✭ 397 (-50.31%)
Mutual labels:  apache-kafka, kafka
Ponzu
Headless CMS with automatic JSON API. Featuring auto-HTTPS from Let's Encrypt, HTTP/2 Server Push, and flexible server framework written in Go.
Stars: ✭ 5,373 (+572.47%)
Mutual labels:  cli, json
Kafka Pixy
gRPC/REST proxy for Kafka
Stars: ✭ 613 (-23.28%)
Mutual labels:  json, kafka
Structured Text Tools
A list of command line tools for manipulating structured text data
Stars: ✭ 6,180 (+673.47%)
Mutual labels:  cli, json
Cppkafka
Modern C++ Apache Kafka client library (wrapper for librdkafka)
Stars: ✭ 413 (-48.31%)
Mutual labels:  apache-kafka, kafka
Node Minify
Lightweight Node.js module that compresses JavaScript, CSS and HTML files
Stars: ✭ 404 (-49.44%)
Mutual labels:  cli, json
Jtc
JSON processing utility
Stars: ✭ 425 (-46.81%)
Mutual labels:  cli, json
Kafka Connect Hdfs
Kafka Connect HDFS connector
Stars: ✭ 400 (-49.94%)
Mutual labels:  apache-kafka, kafka
Ramda Cli
🐏 A CLI tool for processing data with functional pipelines
Stars: ✭ 515 (-35.54%)
Mutual labels:  cli, json
Kafka Storm Starter
Code examples that show how to integrate Apache Kafka 0.8+ with Apache Storm 0.9+ and Apache Spark Streaming 1.1+, using Apache Avro as the data serialization format.
Stars: ✭ 728 (-8.89%)
Mutual labels:  apache-kafka, kafka
Resume Cli
CLI tool to easily setup a new resume 📑
Stars: ✭ 3,967 (+396.5%)
Mutual labels:  cli, json
Kafka Sprout
🚀 Web GUI for Kafka Cluster Management
Stars: ✭ 388 (-51.44%)
Mutual labels:  apache-kafka, kafka
Trdsql
CLI tool that can execute SQL queries on CSV, LTSV, JSON and TBLN. Can output to various formats.
Stars: ✭ 593 (-25.78%)
Mutual labels:  cli, json

kt - a Kafka tool that likes JSON

Some reasons why you might be interested:

  • Consume messages on specific partitions between specific offsets.
  • Display topic information (e.g., with partition offset and leader info).
  • Modify consumer group offsets (e.g., resetting or manually setting offsets per topic and per partition).
  • JSON output for easy consumption with tools like kp or jq.
  • JSON input to facilitate automation via tools like jsonify.
  • Configure brokers, topic, and authentication via the environment variables KT_BROKERS, KT_TOPIC, and KT_AUTH.
  • Fast start-up time.
  • No buffering of output.
  • Binary keys and payloads can be passed and presented in base64 or hex encoding.
  • Support for TLS authentication.
  • Basic cluster admin functions: Create & delete topics.

I'm no longer using kt actively myself, so if you think it's lacking a feature, please let me know by creating an issue!

Examples

Read details about topics that match a regex
$ kt topic -filter news -partitions
{
  "name": "actor-news",
  "partitions": [
    {
      "id": 0,
      "oldest": 0,
      "newest": 0
    }
  ]
}
Produce messages
$ echo 'Alice wins Oscar' | kt produce -topic actor-news -literal
{
  "count": 1,
  "partition": 0,
  "startOffset": 0
}
$ echo 'Bob wins Oscar' | kt produce -topic actor-news -literal
{
  "count": 1,
  "partition": 0,
  "startOffset": 0
}
$ for i in {6..9}; do echo "Bourne sequel $i in production." | kt produce -topic actor-news -literal; done
{
  "count": 1,
  "partition": 0,
  "startOffset": 1
}
{
  "count": 1,
  "partition": 0,
  "startOffset": 2
}
{
  "count": 1,
  "partition": 0,
  "startOffset": 3
}
{
  "count": 1,
  "partition": 0,
  "startOffset": 4
}
Or pass in a JSON object to control key, value, and partition
$ echo '{"value": "Terminator terminated", "key": "Arni", "partition": 0}' | kt produce -topic actor-news
{
  "count": 1,
  "partition": 0,
  "startOffset": 5
}
Read messages at specific offsets on specific partitions
$ kt consume -topic actor-news -offsets 0=1:2
{
  "partition": 0,
  "offset": 1,
  "key": "",
  "value": "Bourne sequel 6 in production.",
  "timestamp": "1970-01-01T00:59:59.999+01:00"
}
{
  "partition": 0,
  "offset": 2,
  "key": "",
  "value": "Bourne sequel 7 in production.",
  "timestamp": "1970-01-01T00:59:59.999+01:00"
}
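Since kt emits one JSON object per message, its output composes directly with jq (mentioned above). The full pipeline needs a running broker, so the sketch below simulates kt's output with echo; in practice you'd pipe kt consume straight into jq:

```shell
# Real pipeline (requires a broker):
#   kt consume -topic actor-news -offsets 0=1:2 | jq -r .value
# Simulated with one captured message so the pipeline shape is visible:
echo '{"partition":0,"offset":1,"key":"","value":"Bourne sequel 6 in production."}' \
  | jq -r .value
# prints: Bourne sequel 6 in production.
```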
Follow a topic, starting relative to newest offset
$ kt consume -topic actor-news -offsets all=newest-1:
{
  "partition": 0,
  "offset": 4,
  "key": "",
  "value": "Bourne sequel 9 in production.",
  "timestamp": "1970-01-01T00:59:59.999+01:00"
}
{
  "partition": 0,
  "offset": 5,
  "key": "Arni",
  "value": "Terminator terminated",
  "timestamp": "1970-01-01T00:59:59.999+01:00"
}
^Creceived interrupt - shutting down
shutting down partition consumer for partition 0
View offsets for a given consumer group
$ kt group -group enews -topic actor-news -partitions 0
found 1 groups
found 1 topics
{
  "name": "enews",
  "topic": "actor-news",
  "offsets": [
    {
      "partition": 0,
      "offset": 6,
      "lag": 0
    }
  ]
}
Change consumer group offset
$ kt group -group enews -topic actor-news -partitions 0 -reset 1
found 1 groups
found 1 topics
{
  "name": "enews",
  "topic": "actor-news",
  "offsets": [
    {
      "partition": 0,
      "offset": 1,
      "lag": 5
    }
  ]
}
$ kt group -group enews -topic actor-news -partitions 0
found 1 groups
found 1 topics
{
  "name": "enews",
  "topic": "actor-news",
  "offsets": [
    {
      "partition": 0,
      "offset": 1,
      "lag": 5
    }
  ]
}
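The lag field is simply the distance from the group's committed offset to the partition's newest offset, which is why resetting the offset from 6 back to 1 raises the lag from 0 to 5:

```shell
# lag = newest partition offset - committed group offset
# (values taken from the example above: 6 messages produced, offset reset to 1)
newest=6
committed=1
echo $((newest - committed))
# prints: 5
```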
Create and delete a topic
$ kt admin -createtopic morenews -topicdetail <(jsonify =NumPartitions 1 =ReplicationFactor 1)
$ kt topic -filter news
{
  "name": "morenews"
}
$ kt admin -deletetopic morenews
$ kt topic -filter news
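jsonify (used above) is a separate helper for building the JSON document; -topicdetail reads a file, so any source of valid JSON should work. A sketch with plain echo (the kt invocation itself needs a running broker):

```shell
# The topic detail document kt expects, written without jsonify:
echo '{"NumPartitions": 1, "ReplicationFactor": 1}' > topic-detail.json
cat topic-detail.json
# Then (requires a broker):
#   kt admin -createtopic morenews -topicdetail topic-detail.json
```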
Change broker address via environment variable
$ export KT_BROKERS=brokers.kafka:9092
$ kt <command> <option>
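KT_TOPIC and KT_AUTH work the same way, so a session or script can set all three once and drop the corresponding flags (broker address, topic name, and path below are examples):

```shell
# kt reads these on every invocation:
export KT_BROKERS=brokers.kafka:9092
export KT_TOPIC=actor-news
export KT_AUTH="$HOME/.kt-auth.json"   # example path to an auth config
# e.g. this now targets brokers.kafka:9092 and actor-news (requires a broker):
#   kt consume -offsets all=newest-1:
echo "$KT_BROKERS $KT_TOPIC"
```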

Installation

You can download kt via the Releases section.

Alternatively, the usual way via the go tool, for example:

$ go get -u github.com/fgeller/kt

Note that with Go 1.17 and later, go get no longer installs binaries; use go install with a version instead, e.g. go install github.com/fgeller/kt@latest (adjust the module path if the project uses a major-version suffix).

Or via Homebrew on macOS:

$ brew tap fgeller/tap
$ brew install kt

Docker

@Paxa maintains an image to run kt in a Docker environment - thanks!

For more information: https://github.com/Paxa/kt

Usage:

$ kt -help
kt is a tool for Kafka.

Usage:

        kt command [arguments]

The commands are:

        consume        consume messages.
        produce        produce messages.
        topic          topic information.
        group          consumer group information and modification.
        admin          basic cluster administration.

Use "kt [command] -help" for more information about the command.

Authentication:

Authentication with Kafka can be configured via a JSON file.
You can set the file name via an "-auth" flag to each command or
set it via the environment variable KT_AUTH.

Authentication / Encryption

Authentication configuration is possible via a JSON file. You indicate the mode of authentication you need and provide additional information as required for that mode. You pass the path to your configuration file via the -auth flag to each command individually, or set it via the environment variable KT_AUTH.

TLS

Required fields:

  • mode: This needs to be set to TLS
  • client-certificate: Path to your certificate
  • client-certificate-key: Path to your certificate key
  • ca-certificate: Path to your CA certificate

Example of an authentication configuration used for the system tests:

{
    "mode": "TLS",
    "client-certificate": "test-secrets/kt-test.crt",
    "client-certificate-key": "test-secrets/kt-test.key",
    "ca-certificate": "test-secrets/snakeoil-ca-1.crt"
}
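To wire this up, write the file and point kt at it, either per command via -auth or once via KT_AUTH (a sketch reusing the paths from the example above):

```shell
# Write the auth config:
cat > kt-auth.json <<'EOF'
{
    "mode": "TLS",
    "client-certificate": "test-secrets/kt-test.crt",
    "client-certificate-key": "test-secrets/kt-test.key",
    "ca-certificate": "test-secrets/snakeoil-ca-1.crt"
}
EOF
# Per-command (requires a broker):
#   kt topic -auth kt-auth.json -filter news
# Or once for the session:
export KT_AUTH="$PWD/kt-auth.json"
grep '"mode"' kt-auth.json
```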

TLS one-way

Required fields:

  • mode: This needs to be set to TLS-1way

Example:

{
    "mode": "TLS-1way"
}

Other modes

Please create an issue with details for the mode that you need.
