
ch-robinson / dotnet-avro

License: MIT
An Avro implementation for .NET


Chr.Avro

Chr.Avro is an Avro implementation for .NET. It’s designed to serve as a flexible alternative to the Apache implementation and integrate seamlessly with Confluent’s Kafka and Schema Registry clients.

For more information, check out the documentation.

Quick start

To use the command line interface, install Chr.Avro.Cli as a global tool:

$ dotnet tool install Chr.Avro.Cli --global
You can invoke the tool using the following command: dotnet-avro
Tool 'chr.avro.cli' (version '8.1.1') was successfully installed.
$ dotnet avro help
Chr.Avro 8.1.1
...

To use the Kafka producer/consumer builders in your project, add Chr.Avro.Confluent as a project dependency. After that, check out this guide or read on for more examples.
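Adding the dependency is a single command from your project directory (a standard `dotnet` CLI invocation; the package name is as published on NuGet):

```shell
dotnet add package Chr.Avro.Confluent
```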

Examples

The CLI can be used to generate Avro schemas for .NET types (both built-in and from compiled assemblies):

$ dotnet avro create -t System.Int32
"int"
$ dotnet avro create -t System.Decimal
{"type":"bytes","logicalType":"decimal","precision":29,"scale":14}
$ dotnet avro create -a out/example.dll -t ExampleRecord
{"name":"ExampleRecord","type":"record","fields":[{"name":"Number","type":"long"}]}
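Logical types are mapped as well. For instance, a record with a Guid property would yield a schema along these lines (a sketch, not captured CLI output; ExampleMessage is a hypothetical type, and the exact logical type mapping may vary by version):

```json
{
  "name": "ExampleMessage",
  "type": "record",
  "fields": [
    {"name": "Id", "type": {"type": "string", "logicalType": "uuid"}}
  ]
}
```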

It can also verify that a .NET type can be mapped to a Schema Registry schema (useful for both development and CI):

$ dotnet avro registry-test -a out/example.dll -t ExampleRecord -r http://registry:8081 -i 242
A deserializer cannot be created for ExampleRecord: ExampleRecord does not have a field or property that matches the correlation_id field on example_record.

Extensions to the Confluent.Kafka ProducerBuilder and ConsumerBuilder configure Kafka clients to produce and consume Avro-encoded CLR objects:

using Chr.Avro.Confluent;
using Confluent.Kafka;
using Confluent.SchemaRegistry;
using System;
using System.Threading;

namespace Example
{
    class ExampleRecord
    {
        public Guid CorrelationId { get; set; }
        public DateTime Timestamp { get; set; }
    }

    class Program
    {
        static void Main(string[] args)
        {
            var consumerConfig = new ConsumerConfig()
            {
                BootstrapServers = "broker1:9092,broker2:9092",
                GroupId = "example_consumer_group"
            };

            var registryConfig = new SchemaRegistryConfig()
            {
                SchemaRegistryUrl = "http://registry:8081"
            };

            var builder = new ConsumerBuilder<string, ExampleRecord>(consumerConfig);

            using (var registry = new CachedSchemaRegistryClient(registryConfig))
            {
                builder.SetAvroKeyDeserializer(registry);
                builder.SetAvroValueDeserializer(registry);

                using (var consumer = builder.Build())
                {
                    var result = consumer.Consume(CancellationToken.None);
                    Console.WriteLine($"Consumed message! {result.Key}: {result.Value.Timestamp}");
                }
            }
        }
    }
}
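The producer side is symmetric: Chr.Avro.Confluent also extends ProducerBuilder with Avro serializer setters. Below is a minimal sketch, assuming SetAvroKeySerializer/SetAvroValueSerializer overloads that mirror the deserializer setters above; exact overloads (for example, those controlling automatic schema registration) vary between versions:

```csharp
using Chr.Avro.Confluent;
using Confluent.Kafka;
using Confluent.SchemaRegistry;
using System;
using System.Threading.Tasks;

namespace Example
{
    class ExampleRecord
    {
        public Guid CorrelationId { get; set; }
        public DateTime Timestamp { get; set; }
    }

    class Program
    {
        static async Task Main(string[] args)
        {
            var producerConfig = new ProducerConfig()
            {
                BootstrapServers = "broker1:9092,broker2:9092"
            };

            var registryConfig = new SchemaRegistryConfig()
            {
                SchemaRegistryUrl = "http://registry:8081"
            };

            var builder = new ProducerBuilder<string, ExampleRecord>(producerConfig);

            using (var registry = new CachedSchemaRegistryClient(registryConfig))
            {
                // assumed extension methods; they mirror the deserializer setters above
                builder.SetAvroKeySerializer(registry);
                builder.SetAvroValueSerializer(registry);

                using (var producer = builder.Build())
                {
                    await producer.ProduceAsync("example_topic", new Message<string, ExampleRecord>
                    {
                        Key = "example_key",
                        Value = new ExampleRecord
                        {
                            CorrelationId = Guid.NewGuid(),
                            Timestamp = DateTime.UtcNow
                        }
                    });
                }
            }
        }
    }
}
```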

Under the hood, SchemaBuilder is responsible for generating schemas from CLR types:

using Chr.Avro.Abstract;
using Chr.Avro.Representation;
using System;

namespace Example
{
    enum Fear
    {
        Bears,
        Children,
        Haskell,
    }

    struct FullName
    {
        public string FirstName { get; set; }
        public string LastName { get; set; }
    }

    class Person
    {
        public Guid Id { get; set; }
        public Fear GreatestFear { get; set; }
        public FullName Name { get; set; }
    }

    class Program
    {
        static void Main(string[] args)
        {
            var builder = new SchemaBuilder();
            var writer = new JsonSchemaWriter();

            Console.WriteLine(writer.Write(builder.BuildSchema<double>()));
            // "double"

            Console.WriteLine(writer.Write(builder.BuildSchema<DateTime>()));
            // "string"

            Console.WriteLine(writer.Write(builder.BuildSchema<Fear>()));
            // {"name":"Fear","type":"enum","symbols":["Bears","Children","Haskell"]}

            Console.WriteLine(writer.Write(builder.BuildSchema<Person>()));
            // {"name":"Person","type":"record"...}
        }
    }
}

For more complex examples, see the examples directory.

Contributing

Check out the contribution guidelines prior to opening an issue or creating a pull request. More information about the benchmark applications and documentation site can be found in their respective directories.

Cake handles all tasks related to building and publishing the Chr.Avro libraries and CLI. This repository doesn’t include bootstrapper scripts; installing and running Cake as a global tool is recommended:

$ dotnet tool install Cake.Tool --global
You can invoke the tool using the following command: dotnet-cake
Tool 'cake.tool' (version '0.35.0') was successfully installed.

$ dotnet cake build.cake --target=Pack
...

The following targets are supported:

Analyze: runs mdoc on the library projects and writes the results to docs/api
Benchmark: runs the benchmark applications and writes the results to docs/benchmarks
Build: builds all projects
Clean: removes all build, documentation, and release artifacts
Pack: creates NuGet packages for the library projects
Publish: pushes packages to NuGet
Test: runs the test projects