
mguenther / spring-kafka-event-sourcing-sampler

Licence: other
Showcases how to build a small Event-sourced application using Spring Boot, Spring Kafka, Apache Avro and Apache Kafka

Programming Languages

java
68154 projects - #9 most used programming language

Projects that are alternatives of or similar to spring-kafka-event-sourcing-sampler

fmodel-ts
Functional Domain Modeling with Typescript
Stars: ✭ 41 (+24.24%)
Mutual labels:  cqrs, eventsourcing
factcast
This project is archived. A friendly fork can be found at https://github.com/factcast/factcast/
Stars: ✭ 14 (-57.58%)
Mutual labels:  cqrs, eventsourcing
micro
Functional prooph for microservices
Stars: ✭ 53 (+60.61%)
Mutual labels:  cqrs, eventsourcing
tictactoe-microservices-example
An example of Spring Cloud Microservices application based on books (see Links section)
Stars: ✭ 23 (-30.3%)
Mutual labels:  zuul, eureka
delta
DDD-centric event-sourcing library for the JVM
Stars: ✭ 15 (-54.55%)
Mutual labels:  cqrs, eventsourcing
eventsourcing-go
Event Sourcing + CQRS using Golang Tutorial
Stars: ✭ 75 (+127.27%)
Mutual labels:  cqrs, eventsourcing
pdo-snapshot-store
PDO Snapshot Store
Stars: ✭ 24 (-27.27%)
Mutual labels:  cqrs, eventsourcing
spring-microservices
Spring Cloud Micro Services with Eureka Discovery, Zuul Proxy, OAuth2 Security, Hystrix CircuitBreaker, Sleuth Zipkin, ELK Stack Logging, Kafka, Docker and many new features
Stars: ✭ 114 (+245.45%)
Mutual labels:  zuul, eureka
SplitetFramework
Splitet is a Java-based Event Sourcing framework aimed at teams that are planning a CQRS transition with a minimal learning curve and ease of adaptation.
Stars: ✭ 159 (+381.82%)
Mutual labels:  cqrs, eventsourcing
les
Go directly from an event storming to a working API: Event Markdown / Markup validation & NodeJS CQRS/ES application builder.
Stars: ✭ 48 (+45.45%)
Mutual labels:  cqrs, eventsourcing
event-store-mgmt-ui
Event Store Management UI
Stars: ✭ 23 (-30.3%)
Mutual labels:  cqrs, eventsourcing
sample-axon-kafka
Sample CQRS and event sourced application developed on top of axon framework. (Kafka is used for distributing the events)
Stars: ✭ 31 (-6.06%)
Mutual labels:  cqrs, eventsourcing
awesome cqrs
some links about CQRS / Event Sourcing
Stars: ✭ 61 (+84.85%)
Mutual labels:  cqrs, eventsourcing
eventuous
Minimalistic Event Sourcing library for .NET
Stars: ✭ 236 (+615.15%)
Mutual labels:  cqrs, eventsourcing
spring-microservices
Example of a microservice architecture using Spring Cloud
Stars: ✭ 76 (+130.3%)
Mutual labels:  zuul, eureka
spring-boot-microservice-eureka-zuul-docker-gateway-kubernetes
Spring Boot rest microservices using Kubernetes, ConfigMap, Eureka, Zuul / Spring Boot Gateway, Docker. Monitoring with logstash, logback, elasticsearch, kibana.
Stars: ✭ 86 (+160.61%)
Mutual labels:  zuul, eureka
nota
"None Of The Above" - is going to be a secure online voting system, intended to give the electorate better choices. It always adds one additional choice to anything to be voted on: If more than 50% of voters choose "None of the Above", the election is considered null and void.
Stars: ✭ 17 (-48.48%)
Mutual labels:  cqrs, eventsourcing
nestjs-boilerplate-microservice
Nestjs Microservice boilerplate: apply DDD, CQRS, and Event Sourcing within an event driven architecture
Stars: ✭ 270 (+718.18%)
Mutual labels:  cqrs, eventsourcing
microservices-v8
Learn Microservices with Spring Boot - v8
Stars: ✭ 32 (-3.03%)
Mutual labels:  zuul, eureka
cqrs-typescript
CQRS implementation in typescript
Stars: ✭ 29 (-12.12%)
Mutual labels:  cqrs, eventsourcing

Event Sourcing using Spring Kafka


This repository contains a sample application that demonstrates how to implement an Event-sourced system using the CQRS architectural style. The solution is built on Apache Kafka, integrated into a Spring Boot based application via Spring for Apache Kafka (2.6.5), uses Apache Avro for event serialization and deserialization, and relies on an in-memory H2 database for the query side of the CQRS-based system. The application itself is minimal and implements a subset of David Allen's Getting Things Done time management method.
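
As a rough illustration of the query side of this architecture, the following sketch shows how a Spring Kafka listener could consume Avro-encoded item events and fold them into a projection held in H2. The types ItemEvent, ItemProjection and ItemProjectionRepository are hypothetical placeholders and do not necessarily match the classes used in this repository.

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Minimal sketch of a query-side projection updater. ItemEvent (an
// Avro-generated event), ItemProjection (a JPA entity stored in H2) and
// ItemProjectionRepository (a Spring Data repository) are hypothetical
// placeholder types.
@Component
public class ItemProjectionUpdater {

    private final ItemProjectionRepository repository;

    public ItemProjectionUpdater(ItemProjectionRepository repository) {
        this.repository = repository;
    }

    // Consume item events from Kafka and fold them into the read model.
    @KafkaListener(topics = "item-events", groupId = "query-side")
    public void on(ItemEvent event) {
        ItemProjection projection = repository
                .findById(event.getItemId())
                .orElseGet(() -> new ItemProjection(event.getItemId()));
        projection.apply(event);
        repository.save(projection);
    }
}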

The code presented in this repository is the joint work of Boris Fresow and Markus Günther as part of an article series on Building Event-based applications with Spring Kafka for the German JavaMagazin.

Prerequisites

Running the showcase requires a working installation of Apache ZooKeeper and Apache Kafka. We provide Dockerfiles for both of them to get you started easily. Please make sure that Docker as well as Docker Compose are installed on your system.

Versions

Application       Version   Docker Image
Apache Kafka      2.6.0     wurstmeister/kafka:2.13-2.6.0
Apache ZooKeeper  3.4.13    wurstmeister/zookeeper

Building and Running the Containers

Before you execute the code samples, make sure that you have a working environment running. If you have not done so already, use the script docker/build-images to create Docker images for all required applications. After a couple of minutes, you should be ready to go.

Once the images have been successfully built, you can start the respective containers using the provided docker-compose script. Simply issue

$ docker-compose up

to start Apache Kafka, Apache ZooKeeper and Yahoo Kafka Manager. Stopping the containers is best done from a separate terminal by issuing the following commands.

$ docker-compose stop
$ docker-compose rm

The final rm operation deletes the containers and thus clears all state so you can start over with a clean installation.

For simplicity, we restrict the Kafka cluster to a single Kafka broker. However, scaling to more Kafka brokers is easily done via docker-compose. You will have to provide a sensible value for KAFKA_ADVERTISED_HOST_NAME (other than localhost) for this to work, though.

$ docker-compose scale kafka=3   # scales up to 3 Kafka brokers
$ docker-compose scale kafka=1   # scales down to 1 Kafka broker after the previous upscale

After changing the number of Kafka brokers, give the cluster some time so that all brokers can finish their cluster-join procedure. This should complete within a couple of seconds; you can inspect the output of the respective Docker containers to be sure that everything is fine. Kafka Manager should also reflect the change in the number of Kafka brokers once they have successfully joined the cluster.

Using the API

Running the provided docker-compose setup will fire up a couple of services: Apache Kafka and Apache ZooKeeper, the command side and the query side of the GTD application, as well as two small services for service discovery and for unifying the API. The API gateway listens at localhost:8765. All interaction goes through the API gateway, which takes care of routing each request to an instance of the command side or the query side of the application.
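
A minimal sketch of such an API gateway, assuming Spring Cloud Netflix Zuul for routing and Eureka for service discovery, could look as follows; the actual gateway in this repository may be wired differently, and the class name is a hypothetical placeholder.

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.netflix.eureka.EnableEurekaClient;
import org.springframework.cloud.netflix.zuul.EnableZuulProxy;

// Sketch of an API gateway, assuming Spring Cloud Netflix Zuul and Eureka
// are the technologies behind routing and service discovery in this showcase.
@SpringBootApplication
@EnableZuulProxy      // routes incoming /api/** requests to registered service instances
@EnableEurekaClient   // looks up command- and query-side instances via service discovery
public class ApiGatewayApplication {

    public static void main(String[] args) {
        SpringApplication.run(ApiGatewayApplication.class, args);
    }
}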

Overview

API Endpoint     Method   Description
/items           POST     Creates a new item.
/items           GET      Lists all items that are currently managed.
/items/{itemId}  GET      Shows the details of a specific item.
/items/{itemId}  PUT      Modifies an existing item.
/items/{itemId}  DELETE   Closes an existing item.

The following sections will walk you through a simple example on how to use the API via cURL.

Creating a new item

To create a new item, we simply provide a short description of it as the JSON payload of the request.

{
  "description": "Go shopping"
}

Using cURL we can create the item:

$ curl http://localhost:8765/api/items -X POST -H "Content-Type: application/json" -d '{"description":"Go shopping"}'

This request will be routed to an instance of the command side of the GTD application, where the command is validated before the corresponding event is persisted to the event log.
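
As a rough illustration of this flow, the following sketch shows how such a command-side endpoint might validate the request and publish the resulting event. The types CreateItemRequest and ItemCreatedEvent as well as the topic name "item-events" are hypothetical placeholders.

import java.util.UUID;

import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Sketch of a command-side endpoint: validate the incoming command and, if it
// is admissible, publish the resulting event to the event log. Type names and
// topic name are hypothetical placeholders.
@RestController
@RequestMapping("/items")
public class ItemCommandController {

    private final KafkaTemplate<String, ItemCreatedEvent> kafkaTemplate;

    public ItemCommandController(KafkaTemplate<String, ItemCreatedEvent> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping
    public ResponseEntity<Void> createItem(@RequestBody CreateItemRequest request) {
        if (request.getDescription() == null || request.getDescription().trim().isEmpty()) {
            return ResponseEntity.badRequest().build(); // command validation failed
        }
        String itemId = UUID.randomUUID().toString().substring(0, 7);
        ItemCreatedEvent event = new ItemCreatedEvent(itemId, request.getDescription());
        kafkaTemplate.send("item-events", itemId, event); // append the event to the log
        return ResponseEntity.accepted().build();
    }
}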

Retrieving a list of all items

After creating an item, we'd like to inspect what items our GTD application currently manages. There is an HTTP endpoint for that as well. If you issue the following cURL request

$ curl http://localhost:8765/api/items

you see something along the lines of the following output (pretty-printed).

[
  {
    "id": "07bad2d",
    "description": "Go shopping",
    "requiredTime": 0,
    "dueDate": null,
    "tags": [      
    ],
    "associatedList": null,
    "done": false
  }
]

This shows the item we just created in full detail.

Retrieving a single item

We can retrieve the details of a single item as well. With the next cURL command, we request the details of the item we just created (id: 07bad2d).

$ curl http://localhost:8765/api/items/07bad2d

This yields the following output (pretty-printed):

{
  "id": "07bad2d",
  "description": "Go shopping",
  "requiredTime": 0,
  "dueDate": null,
  "tags": [
      ],
  "associatedList": null,
  "done": false
}

Modifying an existing item

Let's update the item: associate it with a list of tags, set a required time, and put it onto a dedicated list. The payload for this update looks like this:

{
  "tags": ["weekly"],
  "associatedList": "project",
  "requiredTime": 5
}

To issue the update, we simply execute the following cURL command.

$ curl http://localhost:8765/api/items/07bad2d -X PUT -H "Content-Type:application/json" -d '{"tags": ["weekly"], "associatedList":"project", "requiredTime":5}'

This will validate the individual update commands extracted from the payload against the current state of the item. If the validation holds, the respective events are emitted and the state of the item is updated (a sketch of this mechanism follows the item details below). If we look at the details of the item again using

$ curl http://localhost:8765/api/items/07bad2d

we see that the update has been successfully applied.

{
  "id": "07bad2d",
  "description": "Go shopping",
  "requiredTime": 5,
  "dueDate": null,
  "tags": [
    "weekly"
  ],
  "associatedList": "project",
  "done": false
}
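
As described above, the update payload is split into individual commands that are validated against the current state of the item, and an event is emitted only for fields that actually change. A minimal sketch of this mechanism, with hypothetical type names (ItemAggregate, ItemUpdateRequest, TagsAssignedEvent, RequiredTimeChangedEvent), could look as follows.

import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

// Sketch of how an update payload might be split into individual events. The
// referenced types are hypothetical placeholders; the actual command and
// event classes in this repository may be named differently.
public class ItemUpdateHandler {

    public List<Object> handle(ItemAggregate item, ItemUpdateRequest update) {
        List<Object> events = new ArrayList<>();
        if (update.getTags() != null && !Objects.equals(update.getTags(), item.getTags())) {
            events.add(new TagsAssignedEvent(item.getId(), update.getTags()));
        }
        if (update.getRequiredTime() != null
                && !Objects.equals(update.getRequiredTime(), item.getRequiredTime())) {
            events.add(new RequiredTimeChangedEvent(item.getId(), update.getRequiredTime()));
        }
        // Every emitted event is appended to the event log and then applied to
        // the item, which keeps the command side and the query side in sync.
        return events;
    }
}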

Closing an item

To close an item, we issue a DELETE request via cURL.

$ curl http://localhost:8765/api/items/07bad2d -X DELETE

Looking again at the details of the item, we see that its done attribute is now true.

{
  "id": "07bad2d",
  "description": "Go shopping",
  "requiredTime": 5,
  "dueDate": null,
  "tags": [
    "weekly"
  ],
  "associatedList": "project",
  "done": true
}

License

This work is released under the terms of the MIT license.
