
danielgerlag / Conductor

License: MIT
Distributed workflow server

Projects that are alternatives of or similar to Conductor

Arvados
An open source platform for managing and analyzing biomedical big data
Stars: ✭ 274 (-2.49%)
Mutual labels:  workflow-engine, workflow
Flor
a workflow engine
Stars: ✭ 190 (-32.38%)
Mutual labels:  workflow-engine, workflow
Viewflow
Reusable workflow library for Django
Stars: ✭ 2,136 (+660.14%)
Mutual labels:  workflow-engine, workflow
Etl unicorn
Data visualization, data mining, and data processing (ETL)
Stars: ✭ 156 (-44.48%)
Mutual labels:  workflow-engine, workflow
tumbleweed
Lightweight workflow engine microservice implementing BPMN 2.0
Stars: ✭ 23 (-91.81%)
Mutual labels:  workflow, workflow-engine
Batchflow
BatchFlow helps you conveniently work with random or sequential batches of your data and define data processing and machine learning workflows even for datasets that do not fit into memory.
Stars: ✭ 156 (-44.48%)
Mutual labels:  workflow-engine, workflow
Kogito Runtimes
Kogito Runtimes - Kogito is a cloud-native business automation technology for building cloud-ready business applications.
Stars: ✭ 188 (-33.1%)
Mutual labels:  workflow-engine, workflow
Microwf
A simple finite state machine (FSM) with workflow character where you define your workflows in code.
Stars: ✭ 122 (-56.58%)
Mutual labels:  workflow-engine, workflow
zenaton-ruby
💎 Ruby gem to run and orchestrate background jobs with Zenaton Workflow Engine
Stars: ✭ 32 (-88.61%)
Mutual labels:  workflow, workflow-engine
Aiida Core
The official repository for the AiiDA code
Stars: ✭ 238 (-15.3%)
Mutual labels:  workflow-engine, workflow
Django Lb Workflow
Reusable workflow library for Django
Stars: ✭ 153 (-45.55%)
Mutual labels:  workflow-engine, workflow
CaseManagement
CMMN engine implementation in dotnet core
Stars: ✭ 16 (-94.31%)
Mutual labels:  workflow, workflow-engine
Zeebe
Distributed Workflow Engine for Microservices Orchestration
Stars: ✭ 2,165 (+670.46%)
Mutual labels:  workflow-engine, workflow
Workflow core
[Deprecated, use flor_core instead] A Rails engine providing essential workflow infrastructure, based on Workflow Nets.
Stars: ✭ 171 (-39.15%)
Mutual labels:  workflow-engine, workflow
Microflow
Lightweight workflow engine
Stars: ✭ 129 (-54.09%)
Mutual labels:  workflow-engine, workflow
Cuneiform
Cuneiform distributed programming language
Stars: ✭ 175 (-37.72%)
Mutual labels:  workflow-engine, workflow
Pyflow
A lightweight parallel task engine
Stars: ✭ 108 (-61.57%)
Mutual labels:  workflow-engine, workflow
Openmole
Workflow engine for exploration of simulation models using high throughput computing
Stars: ✭ 120 (-57.3%)
Mutual labels:  workflow-engine, workflow
Pallets
Simple and reliable workflow engine, written in Ruby
Stars: ✭ 216 (-23.13%)
Mutual labels:  workflow-engine, workflow
nactivity
Workflow engine based on Activiti
Stars: ✭ 55 (-80.43%)
Mutual labels:  workflow, workflow-engine

Conductor

Conductor is a workflow server built upon Workflow Core that enables you to coordinate multiple services and scripts into workflows so that you can rapidly create complex workflow applications. Workflows are composed of a series of steps, with an internal data object shared between them to pass information around. Conductor automatically runs and tracks each step, and retries when there are errors.

Workflows are written in either JSON or YAML and then added to Conductor's internal registry via the definition API. Then you use the workflow API to invoke them with or without custom data.

Installation

Conductor is available as a Docker image - danielgerlag/conductor

Conductor uses MongoDB as its datastore, so you will also need an instance of MongoDB in order to run Conductor.

Use this command to start a container (with the API available on port 5001) that points to mongodb://my-mongo-server:27017/ as its datastore.

$ docker run -p 127.0.0.1:5001:80/tcp --env dbhost=mongodb://my-mongo-server:27017/ danielgerlag/conductor

If you wish to run a fleet of Conductor nodes, you will also need a Redis instance, which they will use as a backplane. This is not required if you are only running one instance. Simply point all your Conductor instances at the same MongoDB and Redis instances, and they will operate as a load-balanced fleet.

Environment Variables to configure

You can configure the database and Redis backplane by setting environment variables.

dbhost: <<insert connection string to your MongoDB server>>
redis: <<insert connection string to your Redis server>> (optional)
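
For example, a small fleet can be started by pointing every node at the same MongoDB and Redis instances. The hostnames below are placeholders, and the exact Redis connection-string format depends on your Redis setup, so adjust as needed:

$ docker run -p 127.0.0.1:5001:80/tcp --env dbhost=mongodb://my-mongo-server:27017/ --env redis=my-redis-server:6379 danielgerlag/conductor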

If you would like to set up a Conductor container (API on port 5001) and a MongoDB container at the same time and have them linked, use this Docker Compose file:

version: '3'
services:
  conductor:
    image: danielgerlag/conductor
    ports:
    - "5001:80"
    links:
    - mongo
    environment:
      dbhost: mongodb://mongo:27017/
  mongo:
    image: mongo
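
To try the Redis backplane locally, a sketch of an extended compose file might look like the following; it simply adds a stock redis image and sets the redis environment variable described above (the connection-string value is an assumption about the expected format):

version: '3'
services:
  conductor:
    image: danielgerlag/conductor
    ports:
    - "5001:80"
    links:
    - mongo
    - redis
    environment:
      dbhost: mongodb://mongo:27017/
      redis: redis:6379
  mongo:
    image: mongo
  redis:
    image: redis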

Quick example

We'll start by defining a simple workflow that will log "Hello world" as its first step and then "Goodbye!!!" as its second and final step. We POST the definition to /api/definition in either YAML or JSON.

POST /api/definition
Content-Type: application/yaml
Id: Hello1
Steps:
- Id: Step1
  StepType: EmitLog
  NextStepId: Step2
  Inputs:
    Message: '"Hello world"'
    Level: '"Information"'
- Id: Step2
  StepType: EmitLog
  Inputs:
    Message: '"Goodbye!!!"'
    Level: '"Information"'

Now, let's test it by invoking a new instance of our workflow. We do this with a POST to /api/workflow/Hello1.

POST /api/workflow/Hello1
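
With curl, again assuming the API is exposed on localhost:5001:

$ curl -X POST http://localhost:5001/api/workflow/Hello1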

We can also rewrite our workflow to pass custom data to any input on any of its steps.

Id: Hello2
Steps:
- Id: Step1
  StepType: EmitLog
  Inputs:
    Message: data.CustomMessage
    Level: '"Information"'

Now, when we start a new instance of the workflow, we also initialize it with some data.

POST /api/workflow/Hello2
Content-Type: application/x-yaml
CustomMessage: foobar
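
As a curl sketch against the same local container, the custom data goes in the request body:

$ curl -X POST http://localhost:5001/api/workflow/Hello2 -H "Content-Type: application/x-yaml" -d 'CustomMessage: foobar'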

Further reading

Resources

License

This project is licensed under the MIT License - see the LICENSE.md file for details
