
JoseTomasTocino / Yotaq

yotaq - Your Own Task Queue for Python



So you need a task queue for your Python project. Sure, you could check out Celery, and after three months of trying to understand its basic configuration options you'll be good to go. Or you could use a simpler task queue like huey or rq.

Why don't you try building your own task queue? Well, now you can!

First, we'll use Redis as our message broker. There's no need to install Redis locally; we'll use Docker to keep our environment clean. Open a terminal and run:

docker run -p 6379:6379 redis

There you go. Now let's create a Python virtual environment to handle our dependencies, which are the redis Python client library and dill:

virtualenv env
source env/bin/activate
pip install redis dill

Pretty good. Our Python code will use dill to serialize the functions to be run and Redis to store the tasks.
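Why dill and not the standard library's pickle? Plain pickle serializes functions by reference (their importable name), while dill can also serialize things like lambdas by capturing the function body itself. A quick illustrative sketch using only the standard library's pickle (dill exposes the same dumps/loads interface):

```python
import pickle

def greet(name):
    """A module-level function: pickle can serialize this by reference."""
    return f"hello {name}"

# Round-trip a (function, args) pair, the same shape we'll enqueue later.
data = pickle.dumps((greet, ["world"]))
fn, args = pickle.loads(data)
result = fn(*args)
print(result)  # hello world

# A lambda has no importable name, so plain pickle rejects it;
# dill handles it by serializing the function body itself.
try:
    pickle.dumps(lambda x: x + 1)
    lambda_ok = True
except Exception:
    lambda_ok = False
print("pickle can serialize lambdas:", lambda_ok)  # False
```

With dill, both cases work, which is why we can ship arbitrary task functions to the workers.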

The client

The client will issue the tasks to be enqueued, so open up an editor and create a file called client.py. There, we'll define the task that will be sent to the workers, for example:

import random
import time
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

def do_something(arg1, arg2):
    """ Dummy function that just waits a random amount of time """

    logger.info("Performing task with arg1=%s and arg2=%s", arg1, arg2)
    time.sleep(random.uniform(0.0, 1.0))

Now we need to configure our redis client:

import redis

r = redis.Redis(
    host='localhost',
    port=6379
)

Once that's done, we're ready to generate and enqueue some tasks:

import dill

# Generate N tasks
NUM_TASKS = 100
logger.info("Generating %i tasks", NUM_TASKS)

for i in range(NUM_TASKS):

    # Generate two random arguments
    a1 = random.randrange(0, 100)
    a2 = random.randrange(0, 100)

    # Serialize the task and its arguments
    data = dill.dumps((do_something, [a1, a2]))

    # Store it in the message broker
    r.lpush('tasks', data)
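To see why the client LPUSHes while the worker (below) BRPOPs, here's an in-memory stand-in for the Redis list using a deque (purely illustrative, no Redis involved): pushing on the left and popping from the right makes the queue first-in, first-out.

```python
from collections import deque

# An in-memory stand-in for the Redis list 'tasks'.
tasks = deque()

# Client side: LPUSH pushes each new task onto the left end.
for i in range(3):
    tasks.appendleft(f"task-{i}")

# Worker side: BRPOP pops from the right end, so the oldest task comes out first.
order = [tasks.pop() for _ in range(3)]
print(order)  # ['task-0', 'task-1', 'task-2']
```

The real BRPOP additionally blocks until an element is available, which is what lets the worker idle cheaply instead of polling.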

The worker

The worker will do the work (who would've guessed?) by keeping an eye on the task queue and fetching the available tasks to run. Pretty simple. So open up an editor to create our worker.py file and write the following:

import dill
import redis

# Configure our redis client
r = redis.Redis(
    host='localhost',
    port=6379
)

while True:
    # Wait until there's an element in the 'tasks' queue
    key, data = r.brpop('tasks')

    # Deserialize the task
    d_fun, d_args = dill.loads(data)

    # Run the task
    d_fun(*d_args)
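One caveat: as written, any exception raised inside a task will kill the whole worker loop. If you want workers to survive bad tasks, you could wrap the call in a try/except, roughly like this hypothetical run_task helper (sketched with the standard library's pickle, which shares dill's dumps/loads interface):

```python
import logging
import pickle  # stand-in for dill in this sketch; same dumps/loads interface

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

def run_task(data):
    """Deserialize one task and run it, logging failures instead of crashing."""
    fn, args = pickle.loads(data)
    try:
        return fn(*args)
    except Exception:
        logger.exception("Task %s failed", getattr(fn, "__name__", fn))
        return None

def double(x):
    return x * 2

def explode(x):
    raise ValueError(x)

print(run_task(pickle.dumps((double, [21]))))       # 42
print(run_task(pickle.dumps((explode, ["oops"]))))  # None (traceback is logged)
```

The worker loop would then call run_task(data) in place of the direct d_fun(*d_args) call.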

Boom! You're done! Run some workers with:

python worker.py

You can even run them on other machines, such scaling, very distributed. Then run the client to create some tasks:

python client.py

How's that for less than 50 lines of code?
