
isopropylcyanide / Websockets-Vertx-Flink-Kafka

Licence: other
A simple request-response cycle using WebSockets, an Eclipse Vert.x server, Apache Kafka, and Apache Flink.

Programming Languages

java
68154 projects - #9 most used programming language

Projects that are alternatives of or similar to Websockets-Vertx-Flink-Kafka

vertx-vue-keycloak
This repo holds the source code for the Medium article "Vert.x + VueJS + OAuth2 in 5 steps"
Stars: ✭ 20 (+42.86%)
Mutual labels:  eventbus, vertx
Flink Recommandsystem Demo
🚁🚀 A real-time product recommendation system built on Flink. Flink computes product popularity and caches it in Redis, analyzes log data, and stores profile tags and real-time records in HBase. When a user requests recommendations, the popularity ranking is re-sorted against the user's profile, and collaborative-filtering and tag-based recommendation modules attach related products to every item in the newly generated list before it is returned.
Stars: ✭ 3,115 (+22150%)
Mutual labels:  flink, flink-kafka
Vertx Eventbus Java
A Vert.x EventBus client written in Java, works on Android
Stars: ✭ 20 (+42.86%)
Mutual labels:  eventbus, vertx
df data service
DataFibers Data Service
Stars: ✭ 31 (+121.43%)
Mutual labels:  vertx, flink
vxrifa
Utility library for Vert.X that allows using strong-typed interfaces in communication through EventBus
Stars: ✭ 15 (+7.14%)
Mutual labels:  eventbus, vertx
Vertx Zero
Zero Framework:http://www.vertxup.cn
Stars: ✭ 320 (+2185.71%)
Mutual labels:  eventbus, vertx
IntroduceToEclicpseVert.x
This repository contains the code of Vert.x examples contained in my articles published on platforms such as kodcu.com, medium, dzone. How to run each example is described in its readme file.
Stars: ✭ 27 (+92.86%)
Mutual labels:  eventbus, vertx
vertx-rabbitmq-client
Vert.x RabbitMQ Service
Stars: ✭ 71 (+407.14%)
Mutual labels:  vertx
logparser
Easy parsing of Apache HTTPD and NGINX access logs with Java, Hadoop, Hive, Pig, Flink, Beam, Storm, Drill, ...
Stars: ✭ 139 (+892.86%)
Mutual labels:  flink
elasticsearch-client
This client exposes the Elasticsearch Java High Level REST Client for Eclipse Vert.x applications.
Stars: ✭ 40 (+185.71%)
Mutual labels:  vertx
green-annotations
An Android Annotations plugin to support Green Robot.
Stars: ✭ 21 (+50%)
Mutual labels:  eventbus
RxBus
🍾 An RxBus with tag support, thread scheduling, Kotlin support, and automatic unregistration
Stars: ✭ 25 (+78.57%)
Mutual labels:  eventbus
Farseer.Net
Provides consistent standard use of common components of the .Net Core language
Stars: ✭ 42 (+200%)
Mutual labels:  eventbus
parquet-flinktacular
How to use Parquet in Flink
Stars: ✭ 29 (+107.14%)
Mutual labels:  flink
apache-flink-jdbc-streaming
Sample project for Apache Flink with Streaming Engine and JDBC Sink
Stars: ✭ 22 (+57.14%)
Mutual labels:  flink
spring-webflux-research
spring webflux research
Stars: ✭ 42 (+200%)
Mutual labels:  vertx
openapi4j
OpenAPI 3 parser, JSON schema and request validator.
Stars: ✭ 92 (+557.14%)
Mutual labels:  vertx
seatunnel-example
seatunnel plugin developing examples.
Stars: ✭ 27 (+92.86%)
Mutual labels:  flink
flink-connector-kudu
A flink-connector-kudu based on the Apache Bahir Kudu connector, supporting Flink 1.11.x DynamicTableSource/Sink, range partitioning, and more
Stars: ✭ 40 (+185.71%)
Mutual labels:  flink
SANSA-Stack
Big Data RDF Processing and Analytics Stack built on Apache Spark and Apache Jena http://sansa-stack.github.io/SANSA-Stack/
Stars: ✭ 130 (+828.57%)
Mutual labels:  flink

Websockets-Vertx-Kafka-Flink

A simple request-response cycle using WebSockets, an Eclipse Vert.x server, Apache Kafka, and Apache Flink.


  • An incoming request is routed to a non-blocking Vert.x server, which writes the request to a specific Kafka topic (see the sketch after this list)
  • A Flink consumer, implemented as a separate side project, consumes the messages from the given request topic
  • (Optional) The Flink job hits a REST API hosted on a Spring Boot server. You can use JAX-RS or even hardcode the response
  • Flink writes the API result to another topic. Every message carries a unique sender id, and Flink sends the response tagged with the same id
  • Finally, the Vert.x Kafka consumer listens for responses on the response topic and sends an event to a websocket handler
  • The websocket consumer for a specific id writes the response to its own socket, completing the entire async request cycle
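
A minimal sketch of the websocket-to-Kafka leg, assuming the vertx-kafka-client dependency and Vert.x 3.x APIs; the handler path, topic name, and use of the socket's textHandlerID as the sender id follow the description above and are illustrative, not necessarily the project's exact code:

  import io.vertx.core.AbstractVerticle;
  import io.vertx.kafka.client.producer.KafkaProducer;
  import io.vertx.kafka.client.producer.KafkaProducerRecord;

  import java.util.HashMap;
  import java.util.Map;

  public class WebSocketToKafkaVerticle extends AbstractVerticle {

    @Override
    public void start() {
      Map<String, String> config = new HashMap<>();
      config.put("bootstrap.servers", "localhost:9092");
      config.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
      config.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
      KafkaProducer<String, String> producer = KafkaProducer.create(vertx, config);

      vertx.createHttpServer()
        .websocketHandler(ws -> {
          if (!"/wsapi/register".equals(ws.path())) {
            ws.reject();
            return;
          }
          // The socket's textHandlerID doubles as the sender id; the Flink job
          // echoes it back so the response can be routed to the right client.
          String senderId = ws.textHandlerID();
          ws.textMessageHandler(msg ->
            producer.write(KafkaProducerRecord.create("flink-demo", senderId, msg)));
        })
        .listen(9443);
    }
  }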



Prerequisites

  • Java 1.8
  • Apache Kafka 2.0.0
  • Apache Zookeeper 3.4.8
  • Eclipse Vert.x 3.5.3
  • Apache Flink 1.6.0

Setting up Apache Kafka

  # Start Zookeeper instance 
  $ zookeeper-server-start.bat ..\..\config\zookeeper.properties
  
  # Start Kafka server
  $ kafka-server-start.bat ..\..\config\server.properties
  
  # Create a request topic
  $ kafka-topics.bat --create --zookeeper localhost:2181 --replication-factor 1 --partitions 3 --topic flink-demo

  # Create a response topic
  $ kafka-topics.bat --create --zookeeper localhost:2181 --replication-factor 1 --partitions 3 --topic flink-demo-resp
  
  # Verify messages on the request topic flink-demo
  $ kafka-console-consumer.bat --bootstrap-server localhost:9092 --from-beginning --topic flink-demo

  # Verify messages on the response topic flink-demo-resp
  $ kafka-console-consumer.bat --bootstrap-server localhost:9092 --from-beginning --topic flink-demo-resp
  

Make sure the following is appended to config\server.properties:

port = 9092
advertised.host.name = localhost 

Note: Replace .bat files with .sh files when working in a UNIX environment.


What you do in the Flink job depends on the use case. Options include (see the sketch after this list):

  • Make an async REST API call
  • Interact with a database using an async client
  • Return a mock response
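
For the async REST option, a rough sketch using Flink's AsyncDataStream together with AsyncHttpClient (Flink 1.6 APIs); the endpoint path /api/respond and the timeout/capacity values are assumptions, not taken from the project:

  import org.apache.flink.configuration.Configuration;
  import org.apache.flink.streaming.api.functions.async.ResultFuture;
  import org.apache.flink.streaming.api.functions.async.RichAsyncFunction;
  import org.asynchttpclient.AsyncHttpClient;
  import org.asynchttpclient.Dsl;

  import java.util.Collections;

  // Calls the REST endpoint for every request without blocking the Flink task.
  public class RestLookupFunction extends RichAsyncFunction<String, String> {

    private transient AsyncHttpClient client;

    @Override
    public void open(Configuration parameters) {
      client = Dsl.asyncHttpClient();
    }

    @Override
    public void asyncInvoke(String request, ResultFuture<String> resultFuture) {
      client.prepareGet("http://localhost:9004/api/respond")  // endpoint path is an assumption
        .execute()
        .toCompletableFuture()
        .thenAccept(resp -> resultFuture.complete(Collections.singleton(resp.getResponseBody())));
    }

    @Override
    public void close() throws Exception {
      client.close();
    }
  }

  // Wire it into the pipeline (timeout and capacity values are illustrative):
  // DataStream<String> responses =
  //     AsyncDataStream.unorderedWait(requests, new RestLookupFunction(), 5, TimeUnit.SECONDS, 100);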

Caveats

  • Here, we make a request with the AsyncHttpClient to an endpoint hosted on a Spring Boot server
  • The REST API server listens on port 9004
  • You are free to experiment in this department.
  • If you choose to keep using the REST API call in this project, make sure you have an endpoint implementation (a stand-in sketch follows this list).
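
If you need a stand-in endpoint, a minimal Spring Boot controller like the following would do; the path and payload are placeholders, and server.port=9004 must be set in application.properties:

  import org.springframework.boot.SpringApplication;
  import org.springframework.boot.autoconfigure.SpringBootApplication;
  import org.springframework.web.bind.annotation.GetMapping;
  import org.springframework.web.bind.annotation.RestController;

  // Stand-in for the endpoint the Flink job calls.
  @SpringBootApplication
  @RestController
  public class MockApiApplication {

    @GetMapping("/api/respond")  // keep the path in sync with the Flink job
    public String respond() {
      return "{\"status\":\"ok\"}";
    }

    public static void main(String[] args) {
      SpringApplication.run(MockApiApplication.class, args);
    }
  }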

Setting up the project

  • Run the kafka-flink connector project, which waits for an incoming data stream from the Kafka topic "flink_resp"
  • Run the ws-vertx project, which fires an event on the event bus that writes a sample API request to the topic
  • Verify that the message is written correctly to the topic "flink-demo"
  • The Flink Kafka connector consumes the message, deserializes it, and transforms the data stream into a response stream
  • The Flink job then writes the response back to the response topic "flink-demo-resp" (a job skeleton is sketched below)
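
A skeleton of such a Flink job, assuming the flink-connector-kafka-0.11 dependency; the map step is a placeholder for whatever transformation (REST call, database lookup, mock response) you choose:

  import org.apache.flink.api.common.serialization.SimpleStringSchema;
  import org.apache.flink.streaming.api.datastream.DataStream;
  import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
  import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer011;
  import org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer011;

  import java.util.Properties;

  public class KafkaRequestResponseJob {

    public static void main(String[] args) throws Exception {
      StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

      Properties props = new Properties();
      props.setProperty("bootstrap.servers", "localhost:9092");
      props.setProperty("group.id", "flink-demo-consumer");

      // Read requests written by the Vert.x server.
      DataStream<String> requests = env.addSource(
          new FlinkKafkaConsumer011<>("flink-demo", new SimpleStringSchema(), props));

      // Placeholder transformation; the sender id carried in each message must be
      // preserved so the Vert.x consumer can route the reply to the right socket.
      DataStream<String> responses = requests.map(request -> "processed: " + request);

      // Write responses back for the Vert.x Kafka consumer.
      responses.addSink(new FlinkKafkaProducer011<>(
          "localhost:9092", "flink-demo-resp", new SimpleStringSchema()));

      env.execute("flink-demo request/response job");
    }
  }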

Testing the web socket flow

  • Included within the Vert.x flow is a client socket verticle that emulates a single websocket request
  • It is fired as soon as the server verticle is deployed. [Optional] Look for the following:
  # Uncomment the below line for local UI testing: It creates a websocket request to the given server
 //vertx.deployVerticle(new ClientSocketRequestVerticle());

  • You can, however, choose to send websocket requests from a client manually (a sample client verticle is sketched after this snippet). Use the following:
   # Use the following websocket URL
   ws://127.0.0.1:9443/wsapi/register

   # Once the socket opens, begin sending messages in the correct format
   {
	    "email": "your email",
	    "password": "your password ",
	    "registerAsAdmin": true
   }
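
For reference, a manual client could also be a small Vert.x verticle like the sketch below; it assumes the same host, port, and path as above and simply prints whatever comes back on the socket:

  import io.vertx.core.AbstractVerticle;
  import io.vertx.core.Vertx;
  import io.vertx.core.http.HttpClient;
  import io.vertx.core.json.JsonObject;

  public class ManualSocketClient extends AbstractVerticle {

    @Override
    public void start() {
      HttpClient client = vertx.createHttpClient();
      client.websocket(9443, "127.0.0.1", "/wsapi/register", ws -> {
        // Print every response frame pushed back by the server.
        ws.textMessageHandler(response -> System.out.println("Got response: " + response));
        // Send one registration message in the expected format.
        ws.writeTextMessage(new JsonObject()
            .put("email", "your email")
            .put("password", "your password")
            .put("registerAsAdmin", true)
            .encode());
      });
    }

    public static void main(String[] args) {
      Vertx.vertx().deployVerticle(new ManualSocketClient());
    }
  }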


Websockets

  • Websockets handle communication between the app and the backend
  • Async messages over a non-blocking communication layer
  • Full-duplex communication channels over a single TCP connection

Vert.x

  • A toolkit ecosystem for building reactive applications on the JVM
  • The Vert.x library provides a non-blocking, asynchronous event bus (sketched below)
  • Helps manage the websocket queue
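
To illustrate the event-bus hand-off between the Kafka consumer and the websocket side, here is a rough sketch; the address name kafka.responses and the senderId/payload fields are assumptions for illustration:

  import io.vertx.core.AbstractVerticle;
  import io.vertx.core.eventbus.EventBus;
  import io.vertx.core.json.JsonObject;

  public class ResponseRouterVerticle extends AbstractVerticle {

    @Override
    public void start() {
      EventBus bus = vertx.eventBus();

      // The Kafka consumer verticle publishes each response here (address name assumed).
      bus.consumer("kafka.responses", msg -> {
        JsonObject response = (JsonObject) msg.body();
        // Writing a String to a websocket's textHandlerID address pushes a text
        // frame straight onto that socket, completing the round trip.
        bus.send(response.getString("senderId"), response.getString("payload"));
      });
    }
  }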

Kafka

  • Distributed streaming platform.
  • Kafka provides a fully integrated Streams API that lets an application act as a stream processor, consuming an input stream from one or more topics and producing an output stream to one or more output topics, effectively transforming input streams into output streams (a minimal example follows this list).
  • Handles out-of-order data.
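
The stream-processor pattern described above looks roughly like this with the Kafka Streams API; topic names and the uppercase transform are purely illustrative:

  import org.apache.kafka.common.serialization.Serdes;
  import org.apache.kafka.streams.KafkaStreams;
  import org.apache.kafka.streams.StreamsBuilder;
  import org.apache.kafka.streams.StreamsConfig;
  import org.apache.kafka.streams.kstream.KStream;

  import java.util.Properties;

  public class UppercaseStreamProcessor {

    public static void main(String[] args) {
      Properties props = new Properties();
      props.put(StreamsConfig.APPLICATION_ID_CONFIG, "demo-stream-processor");
      props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
      props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
      props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

      // Consume an input topic, transform each record, produce to an output topic.
      StreamsBuilder builder = new StreamsBuilder();
      KStream<String, String> input = builder.stream("input-topic");
      input.mapValues(value -> value.toUpperCase()).to("output-topic");

      new KafkaStreams(builder.build(), props).start();
    }
  }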

Flink

  • Open-source platform for distributed stream and batch data processing.
  • Provides data distribution, communication, and fault tolerance for distributed computations over data streams.
  • Builds batch processing on top of the streaming engine, overlaying native iteration support, managed memory, and program optimization.