docker-s3-volume


Creates a Docker container whose data volume is restored from and backed up to a location on S3. You can use this to run short-lived processes that work with and persist data to and from S3.

Usage

For the simplest usage, you can just start the data container:

docker run -d --name my-data-container \
           elementar/s3-volume /data s3://mybucket/someprefix

This will download the data from the S3 location you specify into the container's /data directory. When the container shuts down, the data will be synced back to S3.

To use the data from another container, you can use the --volumes-from option:

docker run -it --rm --volumes-from=my-data-container busybox ls -l /data

Configuring a sync interval

When the BACKUP_INTERVAL environment variable is set, a watcher process will sync the /data directory to S3 on the interval you specify. The interval can be specified in seconds, minutes, hours or days (adding s, m, h or d as the suffix):

docker run -d --name my-data-container -e BACKUP_INTERVAL=2m \
           elementar/s3-volume /data s3://mybucket/someprefix
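To make the suffix convention concrete, here is a minimal sketch (not the image's actual watcher code) of how a value such as 2m maps to a number of seconds:

```shell
# Illustrative helper: convert a BACKUP_INTERVAL-style value ("90s", "2m",
# "1h", "3d", or a bare number of seconds) into seconds.
interval_to_seconds() {
  value=$1
  suffix=${value##*[0-9]}        # trailing non-digit suffix, if any
  number=${value%"$suffix"}      # numeric part
  case "$suffix" in
    ""|s) echo "$number" ;;
    m)    echo $(( number * 60 )) ;;
    h)    echo $(( number * 3600 )) ;;
    d)    echo $(( number * 86400 )) ;;
    *)    echo "invalid interval: $value" >&2; return 1 ;;
  esac
}

interval_to_seconds 2m   # 120
interval_to_seconds 1h   # 3600
```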

Configuring credentials

If you are running on EC2, IAM role credentials should just work. Otherwise, you can supply credential information using environment variables:

docker run -d --name my-data-container \
           -e AWS_ACCESS_KEY_ID=... -e AWS_SECRET_ACCESS_KEY=... \
           elementar/s3-volume /data s3://mybucket/someprefix

Any environment variable available to the aws-cli command can be used; see http://docs.aws.amazon.com/cli/latest/userguide/cli-environment.html for more information.
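For instance, a region can be passed the same way as the keys (the bucket, prefix, and region below are placeholder values):

```shell
docker run -d --name my-data-container \
           -e AWS_ACCESS_KEY_ID=... -e AWS_SECRET_ACCESS_KEY=... \
           -e AWS_DEFAULT_REGION=us-east-1 \
           elementar/s3-volume /data s3://mybucket/someprefix
```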

Configuring an endpoint URL

If you are using an S3-compatible service (such as Oracle OCI Object Storage), you may want to set the service's endpoint URL:

docker run -d --name my-data-container -e ENDPOINT_URL=... \
           elementar/s3-volume /data s3://mybucket/someprefix

Forcing a sync

A final sync will always be performed on container shutdown. A sync can be forced by sending the container the USR1 signal:

docker kill --signal=USR1 my-data-container
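To confirm the forced sync ran, you can inspect the container's recent log output:

```shell
docker logs --tail 20 my-data-container
```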

Forcing a restoration

The first time the container is run, it will fetch the contents of the S3 location to initialize the /data directory. If you want to force an initial restore again, you can run the container again with the --force-restore option:

docker run -d --name my-data-container \
           elementar/s3-volume --force-restore /data s3://mybucket/someprefix

Deletion and sync

By default, files deleted from your local file system are also deleted remotely on the next sync. If you wish to turn this off, set the environment variable S3_SYNC_FLAGS to an empty string:

docker run -d -e S3_SYNC_FLAGS="" elementar/s3-volume /data s3://mybucket/someprefix

Using Compose and named volumes

Most of the time, you will use this image to sync data for another container. You can use docker-compose for that:

# docker-compose.yaml
version: "2"

volumes:
  s3data:
    driver: local

services:
  s3vol:
    image: elementar/s3-volume
    command: /data s3://mybucket/someprefix
    volumes:
      - s3data:/data
  db:
    image: postgres
    volumes:
      - s3data:/var/lib/postgresql/data
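With this file in place, the stack is managed as usual; stopping the services triggers the final sync described above:

```shell
docker-compose up -d   # restore /data from S3, then start both services
docker-compose stop    # stop containers; s3vol syncs /data back to S3
```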

Contributing

  1. Fork it!
  2. Create your feature branch: git checkout -b my-new-feature
  3. Commit your changes: git commit -am 'Add some feature'
  4. Push to the branch: git push origin my-new-feature
  5. Submit a pull request :D

Credits

  • Original Developer - Dave Newman (@whatupdave)
  • Current Maintainer - Fábio Batista (@fabiob)

License

This repository is released under the MIT license.
