
nicor88 / Aws Ecs Airflow

Licence: MIT
Run Airflow in AWS ECS(Elastic Container Service) using Fargate tasks


airflow-ecs

Setup to run Airflow in AWS ECS containers

Requirements

Local

  • Docker

AWS

  • AWS IAM User for the infrastructure deployment, with admin permissions
  • awscli, install it by running pip install awscli
  • terraform >= 0.13
  • set up your IAM User credentials inside ~/.aws/credentials
  • set up these env variables in your .zshrc or .bashrc, or in the terminal session that you are going to use
    export AWS_ACCOUNT=your_account_id
    export AWS_DEFAULT_REGION=us-east-1 # the default region; it must also be set in infrastructure/config.tf
    
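As a quick sanity check, the ECR registry hostname that the later push and deploy steps target is derived from these two variables. This sketch (with placeholder values) shows the standard AWS ECR naming convention:

```shell
# Placeholder values -- substitute your real account ID and region
export AWS_ACCOUNT=123456789012
export AWS_DEFAULT_REGION=us-east-1

# Standard ECR registry hostname format: <account>.dkr.ecr.<region>.amazonaws.com
ECR_REGISTRY="${AWS_ACCOUNT}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com"
echo "$ECR_REGISTRY"
```

You can also verify that your credentials are picked up with `aws sts get-caller-identity`, which prints the account ID your session is using.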

Local Development

  • Generate a Fernet Key:

    pip install cryptography
    export AIRFLOW_FERNET_KEY=$(python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())")
    

    More about that here

  • Start Airflow locally by simply running:

    docker-compose up --build
    

If everything runs correctly, you can reach Airflow by navigating to localhost:8080. The current setup is based on Celery workers; you can monitor how many workers are currently active using Flower, at localhost:5555.

Deploy Airflow on AWS ECS

To run Airflow in AWS we will use ECS (Elastic Container Service).

Deploy Infrastructure using Terraform

Run the following commands:

make infra-init
make infra-plan
make infra-apply

or alternatively

cd infrastructure
terraform get
terraform init -upgrade
terraform plan
terraform apply

By default the infrastructure is deployed in us-east-1.

When the infrastructure is provisioned (the RDS metadata DB will take a while), check that the ECR repository has been created, then run:

bash scripts/push_to_ecr.sh airflow-dev

By default the repository created with Terraform is named airflow-dev. Without this command the ECS services will fail to fetch the latest image from ECR.
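For reference, the push script is roughly equivalent to the sketch below. This is a hedged approximation of what scripts/push_to_ecr.sh likely does, not its actual contents, and it assumes the awscli v2 login syntax:

```shell
# Hypothetical sketch of scripts/push_to_ecr.sh -- the real script may differ.
push_to_ecr() {
  local repo_name="$1"
  local registry="${AWS_ACCOUNT}.dkr.ecr.${AWS_DEFAULT_REGION}.amazonaws.com"
  local image="${registry}/${repo_name}:latest"

  # Authenticate Docker against the ECR registry (awscli v2 syntax)
  aws ecr get-login-password --region "$AWS_DEFAULT_REGION" \
    | docker login --username AWS --password-stdin "$registry"

  # Build the Airflow image and push it to the repository
  docker build -t "$image" .
  docker push "$image"
}

# Usage: push_to_ecr airflow-dev
```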

Deploy new Airflow application

To deploy an updated version of Airflow you need to push a new container image to ECR. You can do that simply by running:

./scripts/deploy.sh airflow-dev

The deployment script will take care of:

  • pushing a new image to your ECR repository
  • re-deploying the ECS services with the updated image
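In terms of plain awscli calls, the redeployment step can be sketched as follows. The cluster and service names in the usage line are illustrative assumptions; the real ones are defined in infrastructure/:

```shell
# Hypothetical sketch of the redeploy step in scripts/deploy.sh.
redeploy_services() {
  local cluster="$1"; shift
  # Force each ECS service to start fresh tasks, which pull the newly
  # pushed :latest image from ECR
  for service in "$@"; do
    aws ecs update-service \
      --cluster "$cluster" \
      --service "$service" \
      --force-new-deployment
  done
}

# Usage (names assumed): redeploy_services airflow-dev webserver scheduler worker
```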

TODO

  • Create Private Subnets
  • Move ECS containers to Private Subnets
  • Use ECS private Links for Private Subnets
  • Improve ECS Task and Service Role