fbacchella / LogHub

Licence: other
No description or website provided.

Programming Languages

java
68154 projects - #9 most used programming language
ANTLR
299 projects
javascript
184084 projects - #8 most used programming language
XSLT
1337 projects
HTML
75241 projects
CSS
56736 projects

Projects that are alternatives of or similar to LogHub

vector
A high-performance observability data pipeline.
Stars: ✭ 12,138 (+31842.11%)
Mutual labels:  events, pipeline
Vector
A reliable, high-performance tool for building observability data pipelines.
Stars: ✭ 8,736 (+22889.47%)
Mutual labels:  events, pipeline
cyberevents
The protocol for EVENTs and TICKETs
Stars: ✭ 16 (-57.89%)
Mutual labels:  events
transceiver
Channel based event bus with request/reply pattern, using promises. For node & browser.
Stars: ✭ 25 (-34.21%)
Mutual labels:  events
from-event
🦊 ViewChild and FromEvent — a Match Made in Angular Heaven
Stars: ✭ 130 (+242.11%)
Mutual labels:  events
modelbox
A high-performance, extensible, easy-to-use framework for AI applications. It provides AI developers with a unified, high-performance, easy-to-use programming framework for quickly building device-edge-cloud AI industry applications on top of full-stack AI services.
Stars: ✭ 48 (+26.32%)
Mutual labels:  pipeline
node-cqrs-saga
Node-cqrs-saga is a node.js module that helps to implement the sagas in cqrs. It can be very useful as domain component if you work with (d)ddd, cqrs, eventdenormalizer, host, etc.
Stars: ✭ 59 (+55.26%)
Mutual labels:  events
hms-av-pipeline-demo
HUAWEI AV Pipeline Kit sample code project, which contains the Java sample code to implement functions like video playback, video super-resolution and media asset management. C++ sample code is contained for calling MediaFilter to use the sound event detection plugin.
Stars: ✭ 14 (-63.16%)
Mutual labels:  pipeline
watermill-sql
SQL Pub/Sub for the Watermill project.
Stars: ✭ 39 (+2.63%)
Mutual labels:  events
nodejs-docker-example
An example of how to run a Node.js project in Docker in a Buildkite pipeline
Stars: ✭ 39 (+2.63%)
Mutual labels:  pipeline
event
📆 Strictly typed event emitter with asynciterator support
Stars: ✭ 30 (-21.05%)
Mutual labels:  events
GeneLab Data Processing
No description or website provided.
Stars: ✭ 32 (-15.79%)
Mutual labels:  pipeline
Pulsar
Open source VFX pipeline tool
Stars: ✭ 20 (-47.37%)
Mutual labels:  pipeline
bookmarks
A PySide2 based file and asset manager for animation and CG productions.
Stars: ✭ 33 (-13.16%)
Mutual labels:  pipeline
scrnaseq
A single-cell RNAseq pipeline for 10X genomics data
Stars: ✭ 60 (+57.89%)
Mutual labels:  pipeline
drop
Pipeline to find aberrant events in RNA-Seq data, useful for diagnosis of rare disorders
Stars: ✭ 69 (+81.58%)
Mutual labels:  pipeline
eventkit
Event-driven data pipelines
Stars: ✭ 94 (+147.37%)
Mutual labels:  pipeline
EDTA
Extensive de-novo TE Annotator
Stars: ✭ 210 (+452.63%)
Mutual labels:  pipeline
pg-pubsub
Reliable PostgreSQL LISTEN/NOTIFY with inter-process lock support
Stars: ✭ 50 (+31.58%)
Mutual labels:  events
unity-build-pipeline
Custom BASH script for build, archive, export and upload APK and IPA to server with Telegram notification
Stars: ✭ 59 (+55.26%)
Mutual labels:  pipeline

LogHub

LogHub is a log pipeline, similar to Logstash, but written in Java for improved stability and performance.

It receives events from external sources, processes them, and sends them onward.

Components are organized into pipelines that can be interconnected. A pipeline starts with a receiver that generates events, sends them through processors, and forwards them to a sender or to another pipeline.

A receiver uses a decoder that takes a byte message and generates an event from it.

A sender uses a coder that takes an event and produces a byte message, which is then sent to the configured destination.

All five kinds of operators (receivers, senders, processors, coders and decoders) are Java classes that can be subclassed for custom usage.
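
As a rough illustration of what such a custom class might look like, here is a minimal Java sketch of a decoder and a processor. The Event, Decoder and Processor types below are simplified stand-ins invented for the example, not the actual LogHub API; only the general shape (a Java class that builds or transforms an event) is taken from the description above.

import java.nio.charset.StandardCharsets;
import java.util.HashMap;

// Stand-ins invented for this sketch; not the actual LogHub classes.
class Event extends HashMap<String, Object> {}

interface Decoder {
    Event decode(byte[] message);
}

interface Processor {
    void process(Event event);
}

// A custom decoder: turns a raw byte message into an event with one field.
class LineDecoder implements Decoder {
    @Override
    public Event decode(byte[] message) {
        Event event = new Event();
        event.put("message", new String(message, StandardCharsets.UTF_8).trim());
        return event;
    }
}

// A custom processor: lower-cases one field of the event.
class LowerCaseField implements Processor {
    private final String field;

    LowerCaseField(String field) {
        this.field = field;
    }

    @Override
    public void process(Event event) {
        Object value = event.get(field);
        if (value instanceof String) {
            event.put(field, ((String) value).toLowerCase());
        }
    }
}

class OperatorSketch {
    public static void main(String[] args) {
        Event event = new LineDecoder().decode("Hello From JRDS\n".getBytes(StandardCharsets.UTF_8));
        new LowerCaseField("message").process(event);
        System.out.println(event); // prints {message=hello from jrds}
    }
}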

For configuration it uses a DSL built with ANTLR. Its syntax is a mix of Logstash configuration files, Java and a small taste of Groovy. The exact grammar can be found at https://github.com/fbacchella/LogHub/blob/master/src/main/antlr4/loghub/Route.g4.

It looks like:

input {
    loghub.receivers.ZMQ {
        listen: "tcp://localhost:2120",
        decoder: loghub.decoders.Log4j
    }
} | $main
input {
    loghub.receivers.Udp {
        port: 2121
        decoder: loghub.decoders.Msgpack
    }
} | $apache

output $main | { loghub.senders.ElasticSearch }

pipeline[apache] {
    loghub.processors.Geoip {
        datfilepath: "/user/local/share/GeoIP/GeoIP.dat",
        locationfield: "location",
        threads: 4
    }
}
pipeline[main] {
    loghub.processors.Log { threads: 2 }
    | event.logger_name == "jrds.starter.Timer" || event.info > 4 ? loghub.processors.Drop : ( loghub.processors.ParseJson | loghub.processors.Groovy { script: "println event['logger_name']" } )
}
extensions: "/usr/share/loghub/plugins:/usr/share/loghub/scripts"

This configuration defines two receivers: one listens over 0MQ for Log4j events; the other listens on a UDP port for MessagePack-encoded events, such as those generated by mod_log_net.

Events received on UDP are sent to a pipeline called "apache". After resolving the visitors' location, all events are forwarded to the default "main" pipeline.

The Log4j events are sent directly to the main pipeline, which applies some processing to them. Pay attention to the test: it is evaluated as a Groovy script.

A property called "extensions" is defined. It declares custom extension folders that are used to resolve scripts and are added to the class path.

In the configuration file, all components are declared directly by their class name.

If needed, a slow or CPU-bound processor can be given dedicated threads by specifying a thread count. There is still a single instance of the processor class, but several threads will feed events to it.
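
The idea is roughly the pattern sketched below: a single processor instance shared by several worker threads that pull events from a queue. This is a generic Java illustration with invented names, not LogHub's actual threading code; the queue size and thread count are arbitrary.

import java.util.Map;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

class ThreadedProcessorSketch {
    // Stand-in for a slow or CPU-bound processor (hypothetical class).
    static class SlowProcessor {
        void process(Map<String, Object> event) {
            // expensive work on the event would happen here
        }
    }

    public static void main(String[] args) {
        // A single shared processor instance...
        SlowProcessor processor = new SlowProcessor();
        BlockingQueue<Map<String, Object>> queue = new ArrayBlockingQueue<>(1024);

        // ...fed by several dedicated worker threads, like "threads: 4" in the DSL example above.
        for (int i = 0; i < 4; i++) {
            Thread worker = new Thread(() -> {
                try {
                    while (true) {
                        processor.process(queue.take());
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            worker.setDaemon(true);
            worker.start();
        }
    }
}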
