shikhar / kafka-connect-dynamodb

License: Apache-2.0
Kafka Connector for DynamoDB [unmaintained]

Programming Languages

Java

Projects that are alternatives to, or similar to, kafka-connect-dynamodb

Kafka Connect Mongodb
**Unofficial / Community** Kafka Connect MongoDB Sink Connector - Find the official MongoDB Kafka Connector here: https://www.mongodb.com/kafka-connector
Stars: ✭ 137 (+356.67%)
Mutual labels:  connector, kafka-connect
Streamx
kafka-connect-s3: ingest data from Kafka to object stores (S3)
Stars: ✭ 96 (+220%)
Mutual labels:  connector, kafka-connect
Stream Reactor
Streaming reference architecture for ETL with Kafka and Kafka Connect. See http://lenses.io for more on how we provide a unified solution to manage your connectors, the most advanced SQL engine for Kafka and Kafka Streams, cluster monitoring and alerting, and more.
Stars: ✭ 753 (+2410%)
Mutual labels:  connector, kafka-connect
kafka-connect-http
Kafka Connect connector that enables Change Data Capture from JSON/HTTP APIs into Kafka.
Stars: ✭ 81 (+170%)
Mutual labels:  connector, kafka-connect
oracdc
Oracle database CDC (Change Data Capture)
Stars: ✭ 51 (+70%)
Mutual labels:  kafka-connect
debezium-incubator
Previously used repository for new Debezium modules and connectors in incubation phase (archived)
Stars: ✭ 89 (+196.67%)
Mutual labels:  kafka-connect
microservices-observability
🎉 Microservices Observability - Log Aggregation, Distributed Tracing, Metrics
Stars: ✭ 40 (+33.33%)
Mutual labels:  kafka-connect
super-serverless-sample
Serverless backend that simulates the voting system of BBB (Big Brother Brasil)
Stars: ✭ 30 (+0%)
Mutual labels:  dynamodb
dynamodb-api
DynamoDB API wrapper gem, offering a simple, SQL-like interface.
Stars: ✭ 15 (-50%)
Mutual labels:  dynamodb
APIConnectors
A curated list of example code to collect data from Web APIs using DataPrep.Connector.
Stars: ✭ 22 (-26.67%)
Mutual labels:  connector
MySQL Module
MySQL connector for the Godot Engine.
Stars: ✭ 30 (+0%)
Mutual labels:  connector
hyper-kube-config
H Y P E R K U B E - A serverless API and kubectl plugin providing storage and retrieval of Kubernetes cluster credentials. Hyperkube leverages AWS Secrets Manager for storing credential information.
Stars: ✭ 27 (-10%)
Mutual labels:  dynamodb
kafka-compose
🎼 Docker Compose files for various Kafka stacks
Stars: ✭ 32 (+6.67%)
Mutual labels:  kafka-connect
kafka-connect-iot-mqtt-connector-example
Internet of Things Integration Example => Apache Kafka + Kafka Connect + MQTT Connector + Sensor Data
Stars: ✭ 170 (+466.67%)
Mutual labels:  kafka-connect
road-to-orleans
This repository illustrates the road to Orleans with practical, real-life examples, from the most basic to more advanced techniques.
Stars: ✭ 55 (+83.33%)
Mutual labels:  dynamodb
graphql-pynamodb
Graphene PynamoDB Integration
Stars: ✭ 63 (+110%)
Mutual labels:  dynamodb
crynamo
DynamoDB client for Crystal.
Stars: ✭ 14 (-53.33%)
Mutual labels:  dynamodb
serverless-grocery-app
Grocery Purchase app with serverless AWS Lambda + DynamoDB + React
Stars: ✭ 28 (-6.67%)
Mutual labels:  dynamodb
iis
Open source cloud-powered microblog platform
Stars: ✭ 42 (+40%)
Mutual labels:  dynamodb
twitter
A serverless social network that's under development with some cool stuff, such as Serverless Framework, AppSync, GraphQL, Lambda, DynamoDB, Cognito, Kinesis Firehose, and Algolia ☁️
Stars: ✭ 29 (-3.33%)
Mutual labels:  dynamodb

kafka-connect-dynamodb is a Kafka Connector for loading data to and from Amazon DynamoDB.

It is implemented using the AWS Java SDK for DynamoDB. For authentication, the DefaultAWSCredentialsProviderChain is used.
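
The default chain looks for credentials in several standard places (environment variables, ~/.aws/credentials, EC2 instance profiles, and so on). As a minimal sketch with placeholder values, exporting the standard AWS environment variables before starting the Connect worker is one option:

$ export AWS_ACCESS_KEY_ID=AKIA...            # placeholder value
$ export AWS_SECRET_ACCESS_KEY=...            # placeholder value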

Building

Run:

$ mvn clean package

You will then find the connector and the JARs it depends on under target/kafka-connect-dynamodb-$version-SNAPSHOT-package/share/java/kafka-connect-dynamodb/*.

To create an uber JAR:

$ mvn -P standalone clean package

The uber JAR will be created at target/kafka-connect-dynamodb-$version-SNAPSHOT-standalone.jar.
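
One way to make either artifact visible to a Connect worker (a sketch, assuming a Kafka version with plugin.path support; the directory below is hypothetical) is:

$ mkdir -p /opt/connect-plugins
$ cp target/kafka-connect-dynamodb-$version-SNAPSHOT-standalone.jar /opt/connect-plugins/

and then set plugin.path=/opt/connect-plugins in the worker configuration. On older Connect versions, adding the JAR to the worker's CLASSPATH achieves the same.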

Sink Connector

Example configuration

Ingest the orders topic to a DynamoDB table of the same name in the specified region:

name=dynamodb-sink-test
topics=orders
connector.class=dynamok.sink.DynamoDbSinkConnector
region=us-west-2
ignore.record.key=true
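
Assuming a local Kafka installation and the above properties saved as dynamodb-sink.properties (the file name is illustrative), the sink can be tried in Connect's standalone mode:

$ bin/connect-standalone.sh config/connect-standalone.properties dynamodb-sink.properties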

Record conversion

Refer to DynamoDB Data Types.

At the top level, the converted value must either be of the DynamoDB Map data type, or the top.key.attribute or top.value.attribute configuration option (for the Kafka record key or value, as applicable) must be set, so that the converted value can be hoisted into a named attribute of the DynamoDB record.
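
For illustration, a sketch of a sink configuration that drops the record key and hoists a non-Map record value under a single envelope attribute (the attribute name payload is hypothetical):

name=dynamodb-sink-envelope
topics=events
connector.class=dynamok.sink.DynamoDbSinkConnector
region=us-west-2
ignore.record.key=true
top.value.attribute=payload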

Schema present

Connect Schema Type                                      DynamoDB
INT8, INT16, INT32, INT64, FLOAT32, FLOAT64, Decimal     Number
BOOL                                                     Boolean
BYTES                                                    Binary
STRING                                                   String
ARRAY                                                    List
MAP [1], STRUCT                                          Map

[1] Map keys must be primitive types, and cannot be optional.

null values for optional schemas are translated to the Null type.
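
To make the mapping concrete, here is a sketch of how a hypothetical STRUCT value would land in DynamoDB's attribute-value notation:

Connect value (STRUCT):  { id: 42 (INT64), name: "alice" (STRING), active: true (BOOL) }
DynamoDB item (Map):     { "id": {"N": "42"}, "name": {"S": "alice"}, "active": {"BOOL": true} }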

Schemaless

Java                             DynamoDB
null                             Null
Number [2]                       Number
Boolean                          Boolean
byte[], ByteBuffer               Binary
String                           String
List                             List
Empty Set [3]                    Null
Set<String>                      String Set
Set<Number>                      Number Set
Set<byte[]>, Set<ByteBuffer>     Binary Set
Map [4]                          Map

Any other data type will cause the connector to fail.

[2] i.e. Byte, Short, Integer, Long, Float, Double, BigInteger, BigDecimal
[3] It is not possible to determine the element type of an empty set.
[4] Map keys must be primitive types, and cannot be optional.
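
Analogously, a hypothetical schemaless value might convert as follows (an illustrative sketch, using the Set<String> to String Set rule above):

Java value:     { "count": 7, "tags": Set<String>{"a", "b"} }
DynamoDB item:  { "count": {"N": "7"}, "tags": {"SS": ["a", "b"]} }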

Configuration options

region

AWS region for DynamoDB.

  • Type: string
  • Default: ""
  • Importance: high
access.key.id

Explicit AWS access key ID. Leave empty to utilize the default credential provider chain.

  • Type: password
  • Default: [hidden]
  • Importance: low
secret.key

Explicit AWS secret access key. Leave empty to utilize the default credential provider chain.

  • Type: password
  • Default: [hidden]
  • Importance: low
batch.size

Batch size, between 1 (a dedicated PutItemRequest for each record) and 25 (the maximum number of items in a BatchWriteItemRequest).

  • Type: int
  • Default: 1
  • Importance: high
kafka.attributes

Trio of attribute names for the topic, partition, and offset to include on records; set to empty to omit these attributes.

  • Type: list
  • Default: [kafka_topic, kafka_partition, kafka_offset]
  • Importance: high
table.format

Format string for the destination DynamoDB table name; use ${topic} as a placeholder for the source topic.

  • Type: string
  • Default: "${topic}"
  • Importance: high
ignore.record.key

Whether to ignore Kafka record keys in preparing the DynamoDB record.

  • Type: boolean
  • Default: false
  • Importance: medium
ignore.record.value

Whether to ignore Kafka record values in preparing the DynamoDB record.

  • Type: boolean
  • Default: false
  • Importance: medium
top.key.attribute

DynamoDB attribute name to use for the record key. Leave empty if no top-level envelope attribute is desired.

  • Type: string
  • Default: ""
  • Importance: medium
top.value.attribute

DynamoDB attribute name to use for the record value. Leave empty if no top-level envelope attribute is desired.

  • Type: string
  • Default: ""
  • Importance: medium
max.retries

The maximum number of times to retry on errors before failing the task.

  • Type: int
  • Default: 10
  • Importance: medium
retry.backoff.ms

The time in milliseconds to wait following an error before a retry attempt is made.

  • Type: int
  • Default: 3000
  • Importance: medium
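
Putting several of these options together, an illustrative (not prescriptive) sink configuration:

name=dynamodb-sink-orders
topics=orders
connector.class=dynamok.sink.DynamoDbSinkConnector
region=us-west-2
table.format=kafka_${topic}
batch.size=25
ignore.record.key=true
top.value.attribute=payload
kafka.attributes=kafka_topic,kafka_partition,kafka_offset
max.retries=10
retry.backoff.ms=3000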

Source Connector

Example configuration

Ingest all DynamoDB tables in the specified region, to Kafka topics with the same name as the source table:

name=dynamodb-source-test
connector.class=dynamok.source.DynamoDbSourceConnector
region=us-west-2

Record conversion

TODO describe conversion scheme

Limitations

DynamoDB records containing heterogeneous lists (L) or maps (M) are not currently supported; these fields will be silently dropped. Adding support for them will become possible once KAFKA-3910 is implemented.

Configuration options

region

AWS region for DynamoDB.

  • Type: string
  • Default: ""
  • Importance: high
access.key.id

Explicit AWS access key ID. Leave empty to utilize the default credential provider chain.

  • Type: password
  • Default: [hidden]
  • Importance: low
secret.key

Explicit AWS secret access key. Leave empty to utilize the default credential provider chain.

  • Type: password
  • Default: [hidden]
  • Importance: low
topic.format

Format string for the destination Kafka topic; use ${table} as a placeholder for the source table name.

  • Type: string
  • Default: "${table}"
  • Importance: high
tables.prefix

Prefix for DynamoDB tables to source from.

  • Type: string
  • Default: ""
  • Importance: medium
tables.whitelist

Whitelist for DynamoDB tables to source from.

  • Type: list
  • Default: ""
  • Importance: medium
tables.blacklist

Blacklist for DynamoDB tables to source from.

  • Type: list
  • Default: ""
  • Importance: medium
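
As with the sink, these options can be combined; an illustrative source configuration that prefixes topic names and restricts the table set (table names are hypothetical):

name=dynamodb-source-filtered
connector.class=dynamok.source.DynamoDbSourceConnector
region=us-west-2
topic.format=dynamodb_${table}
tables.prefix=prod_
tables.whitelist=prod_orders,prod_customers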