confluentinc / Confluent Kafka Go

License: Apache-2.0
Confluent's Apache Kafka Golang client

Programming Languages

  • Go - 31211 projects (#10 most used programming language)
  • C - 50402 projects (#5 most used programming language)
  • HTML - 75241 projects

Projects that are alternatives of or similar to Confluent Kafka Go

Kafka Go
Kafka library in Go
Stars: ✭ 4,200 (+37.84%)
Mutual labels:  consumer, kafka-client, producer
Confluent Kafka Python
Confluent's Kafka Python Client
Stars: ✭ 2,578 (-15.39%)
Mutual labels:  kafka-client, confluent, librdkafka
youtube-dl-nas
YouTube download queue WebSocket server with login, for a private NAS.
Stars: ✭ 136 (-95.54%)
Mutual labels:  consumer, producer
frizzle
The magic message bus
Stars: ✭ 14 (-99.54%)
Mutual labels:  consumer, producer
Confluent Kafka Dotnet
Confluent's Apache Kafka .NET client
Stars: ✭ 2,110 (-30.75%)
Mutual labels:  kafka-client, confluent
Qmq
QMQ is the message middleware widely used inside Qunar. Since its creation in 2012 it has been applied across all of Qunar's business scenarios, including transaction-critical order flows as well as high-throughput scenarios such as quote search.
Stars: ✭ 2,420 (-20.58%)
Mutual labels:  consumer, producer
pulsar-flex
Pulsar Flex is a modern Apache Pulsar client for Node.js, developed to be independent of C++.
Stars: ✭ 43 (-98.59%)
Mutual labels:  consumer, producer
php-kafka-lib
PHP Kafka producer / consumer library with PHP Avro support, based on php-rdkafka
Stars: ✭ 38 (-98.75%)
Mutual labels:  consumer, producer
rocketmq
RocketMQ client for Go supporting producer and consumer.
Stars: ✭ 29 (-99.05%)
Mutual labels:  consumer, producer
Node Sinek
🎩 Most advanced high level Node.js Kafka client
Stars: ✭ 262 (-91.4%)
Mutual labels:  consumer, kafka-client
Librdkafka
The Apache Kafka C/C++ library
Stars: ✭ 5,617 (+84.35%)
Mutual labels:  consumer, librdkafka
Ksql Udf Deep Learning Mqtt Iot
Deep Learning UDF for KSQL for Streaming Anomaly Detection of MQTT IoT Sensor Data
Stars: ✭ 219 (-92.81%)
Mutual labels:  kafka-client, confluent
Kafka Rest
Confluent REST Proxy for Kafka
Stars: ✭ 1,863 (-38.86%)
Mutual labels:  confluent
Fs2 Kafka
Functional Kafka Streams for Scala
Stars: ✭ 183 (-93.99%)
Mutual labels:  kafka-client
Flogo
Project Flogo is an open source ecosystem of opinionated event-driven capabilities to simplify building efficient & modern serverless functions, microservices & edge apps.
Stars: ✭ 1,891 (-37.94%)
Mutual labels:  kafka-client
Strimzi Kafka Bridge
Apache Kafka bridge
Stars: ✭ 137 (-95.5%)
Mutual labels:  kafka-client
Phobos
Simplifying Kafka for ruby apps
Stars: ✭ 176 (-94.22%)
Mutual labels:  kafka-client
Node Rdkafka
Node.js bindings for librdkafka
Stars: ✭ 1,799 (-40.96%)
Mutual labels:  librdkafka
Simpleue
PHP queue worker and consumer - Ready for AWS SQS, Redis, Beanstalkd and others.
Stars: ✭ 124 (-95.93%)
Mutual labels:  consumer
Franz Go
franz-go contains a high performance, pure Go library for interacting with Kafka from 0.8.0 through 2.7.0+. Producing, consuming, transacting, administrating, etc.
Stars: ✭ 199 (-93.47%)
Mutual labels:  kafka-client

Confluent's Golang Client for Apache Kafka™

confluent-kafka-go is Confluent's Golang client for Apache Kafka and the Confluent Platform.

Features:

  • High performance - confluent-kafka-go is a lightweight wrapper around librdkafka, a finely tuned C client.

  • Reliability - There are a lot of details to get right when writing an Apache Kafka client. We get them right in one place (librdkafka) and leverage this work across all of our clients (also confluent-kafka-python and confluent-kafka-dotnet).

  • Supported - Commercial support is offered by Confluent.

  • Future proof - Confluent, founded by the creators of Kafka, is building a streaming platform with Apache Kafka at its core. It's high priority for us that client features keep pace with core Apache Kafka and components of the Confluent Platform.

The Golang bindings provide a high-level Producer and Consumer with support for the balanced consumer groups of Apache Kafka 0.9 and above.

See the API documentation for more information.

Examples

High-level balanced consumer

package main

import (
	"fmt"
	"github.com/confluentinc/confluent-kafka-go/kafka"
)

func main() {

	c, err := kafka.NewConsumer(&kafka.ConfigMap{
		"bootstrap.servers": "localhost",
		"group.id":          "myGroup",
		"auto.offset.reset": "earliest",
	})

	if err != nil {
		panic(err)
	}

	err = c.SubscribeTopics([]string{"myTopic", "^aRegex.*[Tt]opic"}, nil)
	if err != nil {
		panic(err)
	}

	for {
		msg, err := c.ReadMessage(-1)
		if err == nil {
			fmt.Printf("Message on %s: %s\n", msg.TopicPartition, string(msg.Value))
		} else {
			// The client will automatically try to recover from all errors.
			fmt.Printf("Consumer error: %v (%v)\n", err, msg)
		}
	}

	c.Close()
}

Producer

package main

import (
	"fmt"
	"github.com/confluentinc/confluent-kafka-go/kafka"
)

func main() {

	p, err := kafka.NewProducer(&kafka.ConfigMap{"bootstrap.servers": "localhost"})
	if err != nil {
		panic(err)
	}

	defer p.Close()

	// Delivery report handler for produced messages
	go func() {
		for e := range p.Events() {
			switch ev := e.(type) {
			case *kafka.Message:
				if ev.TopicPartition.Error != nil {
					fmt.Printf("Delivery failed: %v\n", ev.TopicPartition)
				} else {
					fmt.Printf("Delivered message to %v\n", ev.TopicPartition)
				}
			}
		}
	}()

	// Produce messages to topic (asynchronously)
	topic := "myTopic"
	for _, word := range []string{"Welcome", "to", "the", "Confluent", "Kafka", "Golang", "client"} {
		p.Produce(&kafka.Message{
			TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
			Value:          []byte(word),
		}, nil)
	}

	// Wait for message deliveries before shutting down
	p.Flush(15 * 1000)
}

More elaborate examples are available in the examples directory, including how to configure the Go client for use with Confluent Cloud.
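As a rough illustration, connecting to a SASL_SSL-secured cluster such as Confluent Cloud mainly amounts to a few extra configuration properties. The broker endpoint and credentials below are placeholders; the examples directory has the authoritative version:

	p, err := kafka.NewProducer(&kafka.ConfigMap{
		"bootstrap.servers": "<broker endpoint>", // placeholder Confluent Cloud bootstrap server
		"security.protocol": "SASL_SSL",          // TLS-encrypted, authenticated connection
		"sasl.mechanisms":   "PLAIN",             // Confluent Cloud uses PLAIN with an API key/secret
		"sasl.username":     "<API key>",         // placeholder credential
		"sasl.password":     "<API secret>",      // placeholder credential
	})
	if err != nil {
		panic(err)
	}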

Getting Started

Supports Go 1.11+ and librdkafka 1.6.0+.

Using Go Modules

Starting with Go 1.13, you can use Go Modules to install confluent-kafka-go.

Import the kafka package from GitHub in your code:

import "github.com/confluentinc/confluent-kafka-go/kafka"

Build your project:

go build ./...

If you are building for Alpine Linux (musl), -tags musl must be specified.

go build -tags musl ./...

A dependency on the latest stable version of confluent-kafka-go should be automatically added to your go.mod file.
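For illustration, the resulting go.mod might contain entries along these lines; the module path and version are placeholders, and go build records whichever stable release it actually resolves:

module example.com/myapp

go 1.13

require github.com/confluentinc/confluent-kafka-go v1.6.1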

Install the client

If Go modules can't be used, we recommend that you version-pin the confluent-kafka-go import to v1 using gopkg.in:

Manual install:

go get -u gopkg.in/confluentinc/confluent-kafka-go.v1/kafka

Golang import:

import "gopkg.in/confluentinc/confluent-kafka-go.v1/kafka"

librdkafka

Prebuilt librdkafka binaries are included with the Go client and librdkafka does not need to be installed separately on the build or target system. The following platforms are supported by the prebuilt librdkafka binaries:

  • macOS x64
  • glibc-based Linux x64 (e.g., RedHat, Debian, CentOS, Ubuntu) - without GSSAPI/Kerberos support
  • musl-based Linux x64 (Alpine) - without GSSAPI/Kerberos support

When building your application for Alpine Linux (musl libc) you must pass -tags musl to go get, go build, etc.

CGO_ENABLED must NOT be set to 0 since the Go client is based on the C library librdkafka.
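For example, in environments where cgo is disabled by default (such as some cross-compilation or minimal CI setups), it can be enabled explicitly:

CGO_ENABLED=1 go build ./...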

If GSSAPI/Kerberos authentication support is required you will need to install librdkafka separately, see the Installing librdkafka chapter below, and then build your Go application with -tags dynamic.

Installing librdkafka

If the bundled librdkafka build is not supported on your platform, or you need a librdkafka with GSSAPI/Kerberos support, you must install librdkafka manually on the build and target system using one of the following alternatives:

  • For Debian and Ubuntu based distros, install librdkafka-dev from the standard repositories or using Confluent's Deb repository.
  • For Redhat based distros, install librdkafka-devel using Confluent's YUM repository.
  • For macOS, install librdkafka from Homebrew. You may also need to brew install pkg-config if you don't already have it: brew install librdkafka pkg-config.
  • For Alpine: apk add librdkafka-dev pkgconf
  • confluent-kafka-go is not supported on Windows.
  • For source builds, see instructions below.

Build from source:

git clone https://github.com/edenhill/librdkafka.git
cd librdkafka
./configure
make
sudo make install

After installing librdkafka you will need to build your Go application with -tags dynamic.
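For example, a module build against the separately installed librdkafka looks like this:

go build -tags dynamic ./...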

Note: If you use the master branch of the Go client, then you need to use the master branch of librdkafka.

confluent-kafka-go requires librdkafka v1.6.0 or later.

API Strands

There are two main API strands: function-based and channel-based.

Function-Based Consumer

Messages, errors and events are polled through the consumer.Poll() function.

Pros:

  • More direct mapping to underlying librdkafka functionality.

Cons:

  • Makes it harder to read from multiple channels, but a goroutine easily solves that (see Cons in the channel-based consumer below about outdated events).
  • Slower than the channel consumer.

See examples/consumer_example
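A minimal sketch of such a poll loop, assuming a consumer c created and subscribed as in the earlier example (the full version lives in examples/consumer_example):

	for {
		ev := c.Poll(100) // poll timeout in milliseconds; returns nil if no event is ready
		if ev == nil {
			continue
		}
		switch e := ev.(type) {
		case *kafka.Message:
			fmt.Printf("Message on %s: %s\n", e.TopicPartition, string(e.Value))
		case kafka.Error:
			// Errors are generally informational; the client recovers automatically.
			fmt.Printf("Error: %v\n", e)
		}
	}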

Channel-Based Consumer (deprecated)

Deprecated: The channel-based consumer is deprecated due to the channel issues mentioned below. Use the function-based consumer.

Messages, errors and events are posted on the consumer.Events() channel for the application to read.

Pros:

  • Possibly more Golang:ish
  • Makes reading from multiple channels easy
  • Fast

Cons:

  • Outdated events and messages may be consumed due to the buffering nature of channels. The extent is limited, but not remedied, by the Events channel buffer size (go.events.channel.size).

See examples/consumer_channel_example

Channel-Based Producer

Application writes messages to the producer.ProduceChannel(). Delivery reports are emitted on the producer.Events() or specified private channel.

Pros:

  • Go:ish
  • Proper channel backpressure if librdkafka internal queue is full.

Cons:

  • Double queueing: messages are first queued in the channel (size is configurable) and then inside librdkafka.

See examples/producer_channel_example
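A minimal sketch, assuming a producer p and a delivery-report goroutine set up as in the Producer example above:

	topic := "myTopic"
	p.ProduceChannel() <- &kafka.Message{
		TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
		Value:          []byte("queued via the producer channel"),
	}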

Function-Based Producer

Application calls producer.Produce() to produce messages. Delivery reports are emitted on the producer.Events() or specified private channel.

Pros:

  • Go:ish

Cons:

  • Produce() is a non-blocking call; if the internal librdkafka queue is full, the call will fail (a retry pattern is sketched below).
  • Somewhat slower than the channel producer.

See examples/producer_example
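A sketch of the retry pattern mentioned above, assuming a producer p and a *kafka.Message named msg (both hypothetical, constructed as in the Producer example):

	if err := p.Produce(msg, nil); err != nil {
		if kerr, ok := err.(kafka.Error); ok && kerr.Code() == kafka.ErrQueueFull {
			// Local queue is full: wait for outstanding deliveries, then retry once.
			p.Flush(1000)
			err = p.Produce(msg, nil)
		}
		if err != nil {
			fmt.Printf("Produce failed: %v\n", err)
		}
	}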

License

Apache License v2.0

KAFKA is a registered trademark of The Apache Software Foundation and has been licensed for use by confluent-kafka-go. confluent-kafka-go has no affiliation with and is not endorsed by The Apache Software Foundation.

Developer Notes

See kafka/README

Contributions to the code, examples, documentation, etc., are very much appreciated.

Make your changes, run gofmt and the tests, push your branch, create a PR, and sign the CLA.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].