
dusterio / Laravel Aws Worker

License: MIT
Run Laravel (or Lumen) tasks and queue listeners inside of AWS Elastic Beanstalk workers

Projects that are alternatives of or similar to Laravel Aws Worker

Laravel Elasticbeanstalk Queue Worker
Stars: ✭ 48 (-82.35%)
Mutual labels:  aws, worker, laravel
Laravel Dynamodb
Eloquent syntax for DynamoDB
Stars: ✭ 342 (+25.74%)
Mutual labels:  aws, laravel
Php Examples For Aws Lambda
Demo serverless applications, examples code snippets and resources for PHP
Stars: ✭ 177 (-34.93%)
Mutual labels:  aws, laravel
Cipi
An Open Source Control Panel for your Cloud! Deploy and manage LEMP apps in one click!
Stars: ✭ 376 (+38.24%)
Mutual labels:  aws, laravel
Simpleue
PHP queue worker and consumer - Ready for AWS SQS, Redis, Beanstalkd and others.
Stars: ✭ 124 (-54.41%)
Mutual labels:  aws, worker
Laravel Plain Sqs
Custom SQS connector for Laravel (or Lumen) that supports third-party, plain JSON messages
Stars: ✭ 91 (-66.54%)
Mutual labels:  aws, laravel
Laravel Aws Sns
Laravel package for the AWS SNS Events
Stars: ✭ 24 (-91.18%)
Mutual labels:  aws, laravel
Sqs Worker Serverless
Example for SQS Worker in AWS Lambda using Serverless
Stars: ✭ 164 (-39.71%)
Mutual labels:  aws, worker
Laravel Aws Eb
Ready-to-deploy configuration to run Laravel on AWS Elastic Beanstalk.
Stars: ✭ 247 (-9.19%)
Mutual labels:  aws, laravel
Pixelfed
Photo Sharing. For Everyone.
Stars: ✭ 3,237 (+1090.07%)
Mutual labels:  laravel
Scoutsuite
Multi-Cloud Security Auditing Tool
Stars: ✭ 3,803 (+1298.16%)
Mutual labels:  aws
Cfripper
Library and CLI tool for analysing CloudFormation templates and checking them for security compliance.
Stars: ✭ 265 (-2.57%)
Mutual labels:  aws
Ray
Debug with Ray to fix problems faster
Stars: ✭ 263 (-3.31%)
Mutual labels:  laravel
Eloquent Power Joins
The Laravel magic you know, now applied to joins.
Stars: ✭ 264 (-2.94%)
Mutual labels:  laravel
Cors
🔮Supported(Laravel/Lumen/PSR-15/Swoft/Slim/ThinkPHP) - PHP CORS (Cross-origin resource sharing) middleware.
Stars: ✭ 266 (-2.21%)
Mutual labels:  laravel
Aws Serverless Samfarm
This repo is full CI/CD Serverless example which was used in the What's New with AWS Lambda presentation at Re:Invent 2016.
Stars: ✭ 271 (-0.37%)
Mutual labels:  aws
Eloquent Builder
Provides an advanced filter for Laravel or Lumen model.
Stars: ✭ 264 (-2.94%)
Mutual labels:  laravel
Html
Laravel package designed to generate common HTML components
Stars: ✭ 265 (-2.57%)
Mutual labels:  laravel
Alfred Laravel Docs
An ultra-fast Laravel docs search workflow for Alfred 3+.
Stars: ✭ 270 (-0.74%)
Mutual labels:  laravel
Serverless Architecture Boilerplate
📦 ⚡️ 🚀 Boilerplate to organize and deploy big projects using AWS API Gateway and AWS Lambda with Serverless Framework
Stars: ✭ 269 (-1.1%)
Mutual labels:  aws

laravel-aws-worker

[Badges: Build Status | Code Climate | Total Downloads | Latest Stable Version | Latest Unstable Version | License]

Run Laravel (or Lumen) tasks and queue listeners inside of AWS Elastic Beanstalk workers

Overview

The Laravel documentation recommends using Supervisor for queue workers and *IX cron for scheduled tasks. However, when you deploy your application to AWS Elastic Beanstalk, neither option is available.

This package helps you run your Laravel (or Lumen) jobs in AWS worker environments.

[Diagrams: standard Laravel queue flow vs. AWS Elastic Beanstalk worker flow]

Dependencies

  • PHP >= 5.5
  • Laravel (or Lumen) >= 5.1

Scheduled tasks - option 1

Option one is to keep your schedule in Kernel.php and run the Laravel schedule runner every minute. Remember how the Laravel documentation tells you to invoke the task scheduler? Right, by running php artisan schedule:run on a regular basis, which normally means adding an entry to your cron file:

* * * * * php /path/to/artisan schedule:run >> /dev/null 2>&1

AWS doesn't allow you to run *IX commands or add cron tasks directly. Instead, the worker environment makes regular HTTP requests (POST, to be precise) to an endpoint of your application.

Add cron.yaml to the root folder of your application (it can be part of your repo, or you can add the file right before deploying to EB - the important thing is that it is present at the time of deployment):

version: 1
cron:
 - name: "schedule"
   url: "/worker/schedule"
   schedule: "* * * * *"

From now on, AWS will POST to /worker/schedule every minute - roughly the same effect we used to achieve by editing a UNIX cron file. The important difference is that the worker environment still has to run a web process in order to execute scheduled tasks. Behind the scenes, the endpoint does something very similar to the built-in schedule:run command.
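
Conceptually, the controller behind /worker/schedule boils down to something like the following route (a simplified sketch for illustration only, not the package's actual implementation - the package registers its own controller for this):

use Illuminate\Support\Facades\Artisan;

Route::post('/worker/schedule', function () {
    // Run all due scheduled tasks - the same thing `php artisan schedule:run` does
    Artisan::call('schedule:run');

    return response('OK', 200);
});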

Your scheduled tasks should be defined in App\Console\Kernel::class - just where they normally live in Laravel, e.g.:

protected function schedule(Schedule $schedule)
{
    $schedule->command('inspire')
              ->everyMinute();
}

Scheduled tasks - option 2

Option two is to keep the schedule in cron.yaml itself, using the name field to specify the artisan command to run:

version: 1
cron:
 - name: "run:command"
   url: "/worker/schedule"
   schedule: "0 * * * *"

 - name: "do:something --param=1 -v"
   url: "/worker/schedule"
   schedule: "*/5 * * * *"

Note that AWS uses the UTC timezone for cron expressions. With the above example, AWS will hit the /worker/schedule endpoint every hour to run the run:command artisan command, and every 5 minutes to run the do:something command. Command parameters aren't supported at this stage.
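
For option two to work, every name in cron.yaml must correspond to a registered artisan command. A minimal sketch of such a command (run:command is a hypothetical name used purely for illustration):

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;

class RunCommand extends Command
{
    // Must match the "name" field in cron.yaml
    protected $signature = 'run:command';

    protected $description = 'Hourly task triggered by the AWS worker schedule';

    public function handle()
    {
        // ... the actual work goes here
    }
}

Don't forget to register the command in App\Console\Kernel (the $commands array, or the load() call in newer Laravel versions).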

Pick whichever option is better for you!

Queued jobs: SQS

Normally Laravel has to poll SQS for new messages, but in the case of AWS Elastic Beanstalk, messages come to us - inside POST requests made by the AWS daemon.

Therefore, this package creates a job manually from the SQS payload that arrived and passes it to the framework's default worker. From that point on, the job is processed the way it would normally be processed in Laravel. If it is processed successfully, our controller returns a 200 HTTP status and the AWS daemon deletes the message from the queue. Again, we don't need to poll for jobs and we don't need to delete them - AWS does that for us.

If you dispatch jobs from another Laravel instance, or if you follow Laravel's payload format {"job":"","data":""}, you are good to go. If you want to receive custom-format JSON messages, you may want to install the Laravel plain SQS package as well.
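
Dispatching jobs from your web environment doesn't change at all - regular Laravel dispatching works as usual. A quick sketch (ProcessReport is a hypothetical job class used only for illustration):

use App\Jobs\ProcessReport;

// Anywhere in your web application - a controller, an event listener, etc.
dispatch(new ProcessReport($reportId));
// (or $this->dispatch(...) via the DispatchesJobs trait in older Laravel versions)

// The job is pushed onto the SQS queue; in the worker environment the AWS daemon
// POSTs it to /worker/queue and this package hands it over to Laravel's queue worker.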

Configuring the queue

Every time you create a worker environment in AWS, you have to choose two SQS queues - either automatically generated ones or some of your existing queues. One queue is for the jobs themselves; the other is for failed jobs - AWS calls it a dead letter queue.

You can set your worker queues either during the environment launch or anytime later in the settings:

AWS Worker queue settings

Don't forget to set the HTTP path to /worker/queue - this is where AWS will hit our application. If you chose to generate the queues automatically, you can see their details later in the SQS section of the AWS console:

AWS SQS details

You have to tell Laravel about this queue. First, set your queue driver to SQS in your .env file:

QUEUE_DRIVER=sqs

Then go to config/queue.php and copy/paste the details from the AWS console:

        ...
        'sqs' => [
            'driver' => 'sqs',
            'key' => 'your-public-key',
            'secret' => 'your-secret-key',
            'prefix' => 'https://sqs.us-east-1.amazonaws.com/your-account-id',
            'queue' => 'your-queue-name',
            'region' => 'us-east-1',
        ],
        ...
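
If you'd rather keep credentials out of version control, the same block can read them from environment variables instead (a sketch - the SQS_* variable names are just a convention, not something this package requires):

        'sqs' => [
            'driver' => 'sqs',
            'key' => env('SQS_KEY'),
            'secret' => env('SQS_SECRET'),
            'prefix' => env('SQS_PREFIX', 'https://sqs.us-east-1.amazonaws.com/your-account-id'),
            'queue' => env('SQS_QUEUE', 'your-queue-name'),
            'region' => env('SQS_REGION', 'us-east-1'),
        ],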

To generate a key and secret, go to Identity and Access Management (IAM) in the AWS console. It's better to create a separate user that ONLY has access to SQS.
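
A minimal IAM policy for such a user could look roughly like this (a sketch - point the Resource at your queue's ARN and trim the actions to what your setup actually needs):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "sqs:SendMessage",
                "sqs:ReceiveMessage",
                "sqs:DeleteMessage",
                "sqs:GetQueueAttributes",
                "sqs:GetQueueUrl"
            ],
            "Resource": "arn:aws:sqs:us-east-1:your-account-id:your-queue-name"
        }
    ]
}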

Installation via Composer

To install simply run:

composer require dusterio/laravel-aws-worker

Or add it to composer.json manually:

{
    "require": {
        "dusterio/laravel-aws-worker": "~0.1"
    }
}

Usage in Laravel 5

// Add in your config/app.php

'providers' => [
    '...',
    'Dusterio\AwsWorker\Integrations\LaravelServiceProvider',
];

After adding the service provider, you should be able to see the two special routes that the package adds:

$ php artisan route:list
+--------+----------+-----------------+------+----------------------------------------------------------+------------+
| Domain | Method   | URI             | Name | Action                                                   | Middleware |
+--------+----------+-----------------+------+----------------------------------------------------------+------------+
|        | POST     | worker/queue    |      | Dusterio\AwsWorker\Controllers\WorkerController@queue    |            |
|        | POST     | worker/schedule |      | Dusterio\AwsWorker\Controllers\WorkerController@schedule |            |
+--------+----------+-----------------+------+----------------------------------------------------------+------------+

The REGISTER_WORKER_ROUTES environment variable controls whether the two routes above are registered. If you run the same application in both web and worker environments, don't forget to set REGISTER_WORKER_ROUTES to false in your web environment - you don't want your regular users to be able to invoke the scheduler or the queue worker.

At the moment, this variable defaults to true.

So that's it: if you (or AWS) hit /worker/queue, Laravel will process one queue item supplied in the POST body. If you hit /worker/schedule, the package runs the scheduler (the same as running php artisan schedule:run in a shell).
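
For a quick local smoke test of the schedule route, you can issue the POST yourself (assuming the application is served locally, e.g. via php artisan serve on port 8000; in production these routes should only be reachable by the local AWS daemon):

curl -X POST http://localhost:8000/worker/schedule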

Usage in Lumen 5

// Add in your bootstrap/app.php
$app->register(Dusterio\AwsWorker\Integrations\LumenServiceProvider::class);

Errors and exceptions

Please make sure that the two special routes are not mounted behind CSRF middleware. These POSTs are not real web forms, so CSRF protection is not necessary here. If you have a global CSRF middleware, add the routes to its exceptions, or apply CSRF only to specific routes or route groups.
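
In a stock Laravel application, excluding the routes typically looks like this (a sketch of the usual App\Http\Middleware\VerifyCsrfToken approach):

<?php

namespace App\Http\Middleware;

use Illuminate\Foundation\Http\Middleware\VerifyCsrfToken as Middleware;

class VerifyCsrfToken extends Middleware
{
    // URIs that should be excluded from CSRF verification
    protected $except = [
        'worker/*',
    ];
}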

If your job fails, the package throws a FailedJobException. If you want to customize the error output, just customize your exception handler. Note that the HTTP status code must be different from 200 for AWS to realize that the job has failed.
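
For example, you could render that exception yourself in App\Exceptions\Handler (a sketch; the exception's exact namespace is assumed here, so check the package source before copying):

use Dusterio\AwsWorker\Exceptions\FailedJobException;

public function render($request, Exception $e)
{
    if ($e instanceof FailedJobException) {
        // Anything other than a 200 tells the AWS daemon the job failed,
        // so the message is retried or eventually lands in the dead letter queue
        return response()->json(['error' => $e->getMessage()], 500);
    }

    return parent::render($request, $e);
}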

ToDo

  1. Add support for AWS dead letter queue (retry jobs from that queue?)

Video tutorials

I've just started an educational YouTube channel that will cover top IT trends in software development and DevOps: config.sys

I'm also glad to announce a new tool of mine: GrammarCI, an automated typo/grammar checker for developers that runs as part of the CI/CD pipeline.

Implications

Note that AWS cron doesn't promise 100% time accuracy. Since cron tasks share the same queue with other jobs, your scheduled tasks may be processed later than expected.

Post scriptum

I wrote a blog post explaining how this actually works.
