
ovotech / kafka-avro-confluent

Licence: EPL-1.0
Kafka De/Serializer using Avro and Confluent's Schema Registry

Programming Languages

Clojure

Projects that are alternatives to or similar to kafka-avro-confluent

Schema Registry
Confluent Schema Registry for Kafka
Stars: ✭ 1,647 (+9050%)
Mutual labels:  avro, confluent
avrora
A convenient Elixir library to work with Avro schemas and Confluent® Schema Registry
Stars: ✭ 59 (+227.78%)
Mutual labels:  avro, confluent
srclient
Golang Client for Schema Registry
Stars: ✭ 188 (+944.44%)
Mutual labels:  avro, confluent
Examples
Demo applications and code examples for Confluent Platform and Apache Kafka
Stars: ✭ 571 (+3072.22%)
Mutual labels:  avro, confluent
avro-serde-php
Avro Serialisation/Deserialisation (SerDe) library for PHP 7.3+ & 8.0 with a Symfony Serializer integration
Stars: ✭ 43 (+138.89%)
Mutual labels:  avro, confluent
Open Bank Mark
A bank simulation application written mainly in Clojure, which can be used for end-to-end testing and for displaying some graphs.
Stars: ✭ 81 (+350%)
Mutual labels:  avro, confluent
confluent-spark-avro
Spark UDFs to deserialize Avro messages with schemas stored in Schema Registry.
Stars: ✭ 18 (+0%)
Mutual labels:  avro, confluent
Schema Registry
A CLI and Go client for Kafka Schema Registry
Stars: ✭ 105 (+483.33%)
Mutual labels:  avro, confluent
schema-registry-php-client
A PHP 7.3+ API client for the Confluent Schema Registry REST API based on Guzzle 6 - http://docs.confluent.io/current/schema-registry/docs/index.html
Stars: ✭ 40 (+122.22%)
Mutual labels:  avro, confluent
Storagetapper
StorageTapper is a scalable realtime MySQL change data streaming, logical backup and logical replication service
Stars: ✭ 232 (+1188.89%)
Mutual labels:  avro
parquet-flinktacular
How to use Parquet in Flink
Stars: ✭ 29 (+61.11%)
Mutual labels:  avro
Mu Haskell
Mu (μ) is a purely functional framework for building micro services.
Stars: ✭ 215 (+1094.44%)
Mutual labels:  avro
Vscode Data Preview
Data Preview 🈸 extension for importing 📤 viewing 🔎 slicing 🔪 dicing 🎲 charting 📊 & exporting 📥 large JSON array/config, YAML, Apache Arrow, Avro, Parquet & Excel data files
Stars: ✭ 245 (+1261.11%)
Mutual labels:  avro
ksql-jdbc-driver
JDBC driver for Apache Kafka
Stars: ✭ 85 (+372.22%)
Mutual labels:  confluent
Jackson Dataformats Binary
Uber-project for standard Jackson binary format backends: avro, cbor, ion, protobuf, smile
Stars: ✭ 221 (+1127.78%)
Mutual labels:  avro
Kafkactl
Command Line Tool for managing Apache Kafka
Stars: ✭ 177 (+883.33%)
Mutual labels:  avro
Bigdata Playground
A complete example of a big data application using : Kubernetes (kops/aws), Apache Spark SQL/Streaming/MLib, Apache Flink, Scala, Python, Apache Kafka, Apache Hbase, Apache Parquet, Apache Avro, Apache Storm, Twitter Api, MongoDB, NodeJS, Angular, GraphQL
Stars: ✭ 177 (+883.33%)
Mutual labels:  avro
singlestore-logistics-sim
Scalable package delivery logistics simulator built using SingleStore and Vectorized Redpanda
Stars: ✭ 31 (+72.22%)
Mutual labels:  avro
parquet-extra
A collection of Apache Parquet add-on modules
Stars: ✭ 30 (+66.67%)
Mutual labels:  avro
kafka-scala-examples
Examples of Avro, Kafka, Schema Registry, Kafka Streams, Interactive Queries, KSQL, Kafka Connect in Scala
Stars: ✭ 53 (+194.44%)
Mutual labels:  avro

kafka-avro-confluent

Kafka De/Serializer using Avro and Confluent's Schema Registry

Migrating from 1.1.1-4 -> 1.1.1-5+

1.1.1-5 adds support for Avro logical types. This support is automatic and applies to both serialisation and deserialisation. If you need to deserialise to the underlying primitive types, disable logical-type conversion when creating your deserialiser:

(des/->avro-deserializer schema-registry :convert-logical-types? false)

Usage

Add the Clojars dependency:

[ovotech/kafka-avro-confluent "2.1.0-7"]
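
For tools.deps users, an equivalent deps.edn coordinate (a straightforward translation of the Leiningen vector above, not shown in the original README):

ovotech/kafka-avro-confluent {:mvn/version "2.1.0-7"}
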
(ns kafka-avro-confluent.readme-test
  (:require [kafka-avro-confluent.v2.deserializer :as des]
            [kafka-avro-confluent.v2.serializer :as ser]))

;; config used to initialise the Confluent Schema Registry client:
(def config
  {;; NOTE auth optional!
   ;; :schema-registry/username "mr.anderson"
   ;; :schema-registry/password "42"
   :schema-registry/base-url "http://localhost:8081"})

;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; # Deserializer
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; implements org.apache.kafka.common.serialization.Deserializer

(des/->avro-deserializer config)

;; Without using logical types
(binding [abracad.avro.conversion/*use-logical-types* false]
  (des/->avro-deserializer config))
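
;; A hedged sketch (not part of the original README) of wiring the deserializer
;; into a KafkaConsumer as the value deserializer; the broker address and
;; group id below are illustrative:
(def consumer
  (org.apache.kafka.clients.consumer.KafkaConsumer.
   {"bootstrap.servers" "localhost:9092"
    "group.id"          "readme-example"}
   (org.apache.kafka.common.serialization.StringDeserializer.)
   (des/->avro-deserializer config)))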

;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; # Serializer
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; implements org.apache.kafka.common.serialization.Serializer
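
;; The `schema` passed to the serializer below is a parsed Avro schema. A
;; minimal sketch of producing one, assuming abracad (which this library
;; builds on) and an illustrative Payment record:
(require '[abracad.avro :as avro])

(def schema
  (avro/parse-schema
   {:type   :record
    :name   "Payment"
    :fields [{:name "id"     :type :string}
             {:name "amount" :type :double}]}))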

;; meant to be used as a value-serializer in a KafkaProducer
(ser/->avro-serializer config schema)

;; If you want to use it as a key-serializer:
(ser/->avro-serializer config :key schema)

;; Using with a KafkaProducer:
;; e.g. (org.apache.kafka.clients.producer.KafkaProducer. producer-config
;;                                                        key-serializer
;;                                                        value-serializer)
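
;; A hedged, fuller sketch (not part of the original README); the broker
;; address, topic name and record values are illustrative, and the record is
;; assumed to be a Clojure map matching the Payment schema defined above:
(def producer
  (org.apache.kafka.clients.producer.KafkaProducer.
   {"bootstrap.servers" "localhost:9092"}
   (org.apache.kafka.common.serialization.StringSerializer.)
   (ser/->avro-serializer config schema)))

(.send producer
       (org.apache.kafka.clients.producer.ProducerRecord.
        "payments" "payment-1" {:id "payment-1" :amount 9.99}))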

;; If the serializer will be used with multiple topics (each with its own schema):
(ser/->avro-serializer config (ser/->schemas-definition {topic1 schema1
                                                         topic2 schema2}))

Versions

Version numbers use this format:

${kafka_version}-${build_number}

For example:

0.10.0-4 # Kafka v = 0.10.0, kafka-avro-confluent build = 4
1.0.1-1  # Kafka v = 1.0.1 , kafka-avro-confluent build = 1

License

Copyright © 2017 OVO Energy Ltd.

Distributed under the Eclipse Public License either version 1.0 or (at your option) any later version.
