
Siddhi IO Kafka


The siddhi-io-kafka extension is a Siddhi extension that receives events from and publishes events to Kafka.

For information on Siddhi and its features, refer to the Siddhi Documentation.

Download

  • Versions 5.x and above with group id io.siddhi.extension.* from here.
  • Versions 4.x and lower with group id org.wso2.extension.siddhi.* from here.

Latest API Docs

The latest API docs are for version 5.0.16.

Features

  • kafka (Sink)

    A Kafka sink publishes events processed by WSO2 SP to a topic with a partition for a Kafka cluster. The events can be published in the TEXT, XML, JSON, or Binary format.
    If the topic is not already created in the Kafka cluster, the Kafka sink creates the default partition for the given topic. The publishing topic and partition can be dynamic values taken from the Siddhi event.
    To configure a sink to use the Kafka transport, the type parameter must have kafka as its value.

  • kafka-replay-request (Sink)

    This sink is used to request the replay of a specific range of events on a specified partition of a topic.

  • kafkaMultiDC (Sink)

    A Kafka sink publishes events processed by WSO2 SP to a topic with a partition for a Kafka cluster. The events can be published in the TEXT, XML, JSON, or Binary format.
    If the topic is not already created in the Kafka cluster, the Kafka sink creates the default partition for the given topic. The publishing topic and partition can be dynamic values taken from the Siddhi event.
    To configure a sink to publish events via the Kafka transport, using two Kafka brokers to publish events to the same topic, the type parameter must have kafkaMultiDC as its value.

  • kafka (Source)

    A Kafka source receives events to be processed by WSO2 SP from a topic with a partition of a Kafka cluster. The events received can be in the TEXT, XML, JSON, or Binary format.
    If the topic is not already created in the Kafka cluster, the Kafka source creates the default partition for the given topic.

  • kafka-replay-response (Source)

    This source listens for replayed events requested via the kafka-replay-request sink.

  • kafkaMultiDC (Source)

    The Kafka Multi-Datacenter (DC) source receives records from the same topic in brokers deployed in two different Kafka clusters. It filters out all duplicate messages and ensures that the events are received in the correct order using sequential numbering. It receives events in the TEXT, XML, JSON, or Binary format. The Kafka source creates the default partition '0' for a given topic if the topic has not yet been created in the Kafka cluster.
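As an illustration, the kafka sink and source described above can be combined in a single Siddhi application. The following is a minimal sketch, assuming a local broker at localhost:9092; the app name, stream names, topic names, and consumer group are placeholders, while the annotation parameters are those documented for this extension:

```siddhi
@App:name('KafkaSample')

@source(type='kafka',
        topic.list='kafka_input_topic',
        partition.no.list='0',
        threading.option='single.thread',
        group.id='sample-group',
        bootstrap.servers='localhost:9092',
        @map(type='json'))
define stream InputStream (symbol string, price float, volume long);

@sink(type='kafka',
      topic='kafka_output_topic',
      partition.no='0',
      bootstrap.servers='localhost:9092',
      @map(type='json'))
define stream OutputStream (symbol string, price float, volume long);

-- Forward every event received from the input topic to the output topic.
from InputStream
select *
insert into OutputStream;
```

Note that the source takes list-valued parameters (topic.list, partition.no.list) so it can consume several topics and partitions, while the sink addresses a single topic and partition.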

Installation

For installing this extension in the Streaming Integrator Server, and to add the dependent jars, refer Streaming Integrator documentation section on downloading and installing siddhi extensions.
For installing this extension in the Streaming Integrator Tooling, and to add the dependent jars, refer Streaming Integrator documentation section on installing siddhi extensions.

Dependencies

The following JARs, found in the <KAFKA_HOME>/libs directory, must be converted to OSGi bundles and copied to WSO2SI_HOME/lib and WSO2SI_HOME/samples/sample-clients/lib:

  • kafka_2.11-*.jar
  • kafka-clients-*.jar
  • metrics-core-*.jar
  • scala-library-2.11.*.jar
  • scala-parser-combinators_2.11.*.jar (if exists)
  • zkclient-*.jar
  • zookeeper-*.jar

Setup Kafka

As a prerequisite, you have to start the Kafka message broker. Follow the steps below:

  1. Download the Kafka distribution.
  2. Extract the distribution and navigate to its root directory.
  3. Start ZooKeeper by executing the following command:
    bin/zookeeper-server-start.sh config/zookeeper.properties
  4. Start the Kafka broker by executing the following command:
    bin/kafka-server-start.sh config/server.properties

For more details, refer to the Kafka quickstart: https://kafka.apache.org/quickstart
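Once both processes are running, you can sanity-check the broker by creating a topic and exchanging a few messages from the console. A sketch, run from the Kafka root directory, with 'kafka_topic' as a placeholder name (on Kafka versions older than 2.2, kafka-topics.sh takes --zookeeper localhost:2181 instead of --bootstrap-server):

```shell
# Create a single-partition topic for testing
bin/kafka-topics.sh --create --topic kafka_topic \
    --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

# Publish test messages interactively (type a line, press Enter to send)
bin/kafka-console-producer.sh --topic kafka_topic --bootstrap-server localhost:9092

# Read the messages back from the beginning of the topic
bin/kafka-console-consumer.sh --topic kafka_topic \
    --bootstrap-server localhost:9092 --from-beginning
```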

Support and Contribution

  • We encourage users to ask questions and get support via StackOverflow; make sure to add the siddhi tag to your question for a better response.

  • If you find any issues related to the extension, please report them on the issue tracker.

  • For production support and other contribution-related information, refer to the Siddhi Community documentation.
