weseek / mongodb-awesome-backup

Licence: other
Awesome backup tools for MongoDB w/ docker integration

What is mongodb-awesome-backup?

mongodb-awesome-backup is a collection of scripts that back up MongoDB databases to Amazon S3 or Google Cloud Storage. You can set a custom S3 endpoint to use S3-based services such as DigitalOcean Spaces instead of Amazon S3.

Requirements

  • Amazon IAM Access Key ID/Secret Access Key
    • which must have access rights to the target Amazon S3 bucket.

OR

  • Google Cloud Interoperable storage access keys (see https://cloud.google.com/storage/docs/migrating#keys)
    • GCP_SERVICE_ACCOUNT_KEY_JSON_PATH and GCP_PROJECT_ID are only required if using service account authentication.
    • GCP_ACCESS_KEY_ID, GCP_SECRET_ACCESS_KEY, and GCP_PROJECT_ID are only required if using HMAC authentication.
    • When using OAuth authentication, a Docker volume mount (-v ~:/mab) can be added so that OAuth credentials are saved to your home directory after mongodb-awesome-backup is run. On subsequent runs, the same ~/.boto file will be used for authentication.
    • The name 'mab' was chosen as the Docker container mount point simply because it is an acronym for "mongodb-awesome-backup". The /mab mount point maps to the home directory of whichever user runs mongodb-awesome-backup, and is where the .boto file is saved.

Usage

Note that either the AWS_* or the GCP_* variables are required, not both.

docker run --rm \
  -e AWS_ACCESS_KEY_ID=<Your IAM Access Key ID> \
  -e AWS_SECRET_ACCESS_KEY=<Your IAM Secret Access Key> \
  [ -e GCP_SERVICE_ACCOUNT_KEY_JSON_PATH=<JSON file path to your GCP Service Account Key> \ ]
  [ -e GCP_ACCESS_KEY_ID=<Your GCP Access Key> \ ]
  [ -e GCP_SECRET_ACCESS_KEY=<Your GCP Secret> \ ]
  [ -e GCP_PROJECT_ID=<Your GCP Project ID> \ ]
  -e TARGET_BUCKET_URL=<Target Bucket URL ([s3://...|gs://...])> \
  [ -e BACKUPFILE_PREFIX=<Prefix of Backup Filename (default: "backup")> \ ]
  [ -e MONGODB_URI=<Target MongoDB URI> \ ]
  [ -e MONGODB_HOST=<Target MongoDB Host (default: "mongo")> \ ]
  [ -e MONGODB_DBNAME=<Target DB name> \ ]
  [ -e MONGODB_USERNAME=<DB login username> \ ]
  [ -e MONGODB_PASSWORD=<DB login password> \ ]
  [ -e MONGODB_AUTHDB=<Authentication DB name> \ ]
  [ -e AWSCLI_ENDPOINT_OPT=<S3 endpoint URL (ex. https://fra1.digitaloceanspaces.com)> \ ]
  [ -v ~:/mab \ ]
  weseek/mongodb-awesome-backup

After running this, backup-YYYYMMdd.tar.bz2 will be placed in the target bucket.
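The dated archive name can be reproduced with plain date; a minimal sketch, assuming the default prefix (BACKUPFILE_PREFIX is configurable, see below):

```shell
# Sketch: how the dated archive name is formed (default prefix "backup")
PREFIX="${BACKUPFILE_PREFIX:-backup}"
FILENAME="${PREFIX}-$(date +%Y%m%d).tar.bz2"
echo "$FILENAME"
```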

How to backup in cron mode

Run the Docker container with CRONMODE=true.

docker run --rm \
  -e AWS_ACCESS_KEY_ID=<Your IAM Access Key ID> \
  -e AWS_SECRET_ACCESS_KEY=<Your IAM Secret Access Key> \
  [ -e GCP_SERVICE_ACCOUNT_KEY_JSON_PATH=<JSON file path to your GCP Service Account Key> \ ]
  [ -e GCP_ACCESS_KEY_ID=<Your GCP Access Key> \ ]
  [ -e GCP_SECRET_ACCESS_KEY=<Your GCP Secret> \ ]
  [ -e GCP_PROJECT_ID=<Your GCP Project ID> \ ]
  -e TARGET_BUCKET_URL=<Target Bucket URL ([s3://...|gs://...])> \
  -e CRONMODE=true \
  -e CRON_EXPRESSION=<Cron expression (ex. "CRON_EXPRESSION='0 4 * * *'" if you want to run at 4:00 every day)> \
  [ -e BACKUPFILE_PREFIX=<Prefix of Backup Filename (default: "backup")> \ ]
  [ -e MONGODB_URI=<Target MongoDB URI> \ ]
  [ -e MONGODB_HOST=<Target MongoDB Host (default: "mongo")> \ ]
  [ -e MONGODB_DBNAME=<Target DB name> \ ]
  [ -e MONGODB_USERNAME=<DB login username> \ ]
  [ -e MONGODB_PASSWORD=<DB login password> \ ]
  [ -e MONGODB_AUTHDB=<Authentication DB name> \ ]
  [ -e AWSCLI_ENDPOINT_OPT=<S3 endpoint URL (ex. https://fra1.digitaloceanspaces.com)> \ ]
  [ -v ~:/mab \ ]
  weseek/mongodb-awesome-backup

How to restore

You can use "restore" command to restore database from backup file.

docker run --rm \
  -e AWS_ACCESS_KEY_ID=<Your IAM Access Key ID> \
  -e AWS_SECRET_ACCESS_KEY=<Your IAM Secret Access Key> \
  [ -e GCP_SERVICE_ACCOUNT_KEY_JSON_PATH=<JSON file path to your GCP Service Account Key> \ ]
  [ -e GCP_ACCESS_KEY_ID=<Your GCP Access Key> \ ]
  [ -e GCP_SECRET_ACCESS_KEY=<Your GCP Secret> \ ]
  [ -e GCP_PROJECT_ID=<Your GCP Project ID> \ ]
  -e TARGET_BUCKET_URL=<Target Bucket URL ([s3://...|gs://...])> \
  -e TARGET_FILE=<Target S3 or GS file name to restore> \
  [ -e MONGODB_URI=<Target MongoDB URI> \ ]
  [ -e MONGODB_HOST=<Target MongoDB Host (default: "mongo")> \ ]
  [ -e MONGODB_DBNAME=<Target DB name> \ ]
  [ -e MONGODB_USERNAME=<DB login username> \ ]
  [ -e MONGODB_PASSWORD=<DB login password> \ ]
  [ -e MONGODB_AUTHDB=<Authentication DB name> \ ]
  [ -e MONGORESTORE_OPTS=<Options list of mongorestore> \ ]
  [ -e AWSCLI_ENDPOINT_OPT=<S3 endpoint URL (ex. https://fra1.digitaloceanspaces.com)> \ ]
  [ -v ~:/mab \ ]
  weseek/mongodb-awesome-backup restore
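Since backups follow the backup-YYYYMMdd.tar.bz2 pattern, TARGET_FILE for a given day can be computed rather than typed; a sketch assuming the default prefix (the date below is hypothetical):

```shell
# Sketch: derive the TARGET_FILE name for a specific backup date
BACKUP_DATE="20240101"   # hypothetical example date
TARGET_FILE="backup-${BACKUP_DATE}.tar.bz2"
echo "$TARGET_FILE"      # → backup-20240101.tar.bz2
```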

Environment variables

For backup, prune, list

Required

| Variable | Description | Default |
| -------- | ----------- | ------- |
| AWS_ACCESS_KEY_ID | Your IAM Access Key ID | - |
| AWS_SECRET_ACCESS_KEY | Your IAM Secret Access Key | - |
| TARGET_BUCKET_URL | Target Bucket URL (s3://... or gs://...). The URL must end with '/'. | - |

Optional

| Variable | Description | Default |
| -------- | ----------- | ------- |
| GCP_SERVICE_ACCOUNT_KEY_JSON_PATH | JSON file path to your GCP Service Account Key | - |
| GCP_ACCESS_KEY_ID | Your GCP Access Key | - |
| GCP_SECRET_ACCESS_KEY | Your GCP Secret | - |
| GCP_PROJECT_ID | Your GCP Project ID | - |
| BACKUPFILE_PREFIX | Prefix of the backup file name | "backup" |
| MONGODB_URI | Target MongoDB URI (ex. mongodb://mongodb?replicaSet=rs0). If set, the other MONGODB_* variables are ignored. | - |
| MONGODB_HOST | Target MongoDB host | "mongo" |
| MONGODB_DBNAME | Target DB name | - |
| MONGODB_USERNAME | DB login username | - |
| MONGODB_PASSWORD | DB login password | - |
| MONGODB_AUTHDB | Authentication DB name | - |
| CRONMODE | If set to "true", the container runs in cron mode: the script is executed with the specified arguments at the times given by CRON_EXPRESSION. | "false" |
| CRON_EXPRESSION | Cron expression (ex. "0 4 * * *" to run at 4:00 every day) | - |
| AWSCLI_ENDPOINT_OPT | Custom S3 endpoint for S3-based services such as DigitalOcean Spaces (ex. "https://fra1.digitaloceanspaces.com"). If not set, the standard Amazon S3 endpoint is used. | - |
| AWSCLIOPT | Other options to pass to the aws command | - |
| GCSCLIOPT | Other options to pass to the gsutil command | - |
| HEALTHCHECKS_URL | URL that is called after a successful backup (e.g. https://healthchecks.io) | - |
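The HEALTHCHECKS_URL ping amounts to an HTTP request after a successful run; a hedged sketch of the equivalent (the curl flags are illustrative, not necessarily the image's exact invocation):

```shell
# Sketch: ping a healthcheck endpoint after a successful backup, if one is configured
HEALTHCHECKS_URL=""   # e.g. a https://healthchecks.io check URL; empty skips the ping
if [ -n "$HEALTHCHECKS_URL" ]; then
  curl -fsS --retry 3 "$HEALTHCHECKS_URL" > /dev/null
fi
echo "backup finished"
```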

For restore

Required

| Variable | Description | Default |
| -------- | ----------- | ------- |
| AWS_ACCESS_KEY_ID | Your IAM Access Key ID | - |
| AWS_SECRET_ACCESS_KEY | Your IAM Secret Access Key | - |
| TARGET_BUCKET_URL | Target Bucket URL (s3://... or gs://...). The URL must end with '/'. | - |
| TARGET_FILE | Target S3 or GS file name to restore | - |

Optional

| Variable | Description | Default |
| -------- | ----------- | ------- |
| GCP_SERVICE_ACCOUNT_KEY_JSON_PATH | JSON file path to your GCP Service Account Key | - |
| GCP_ACCESS_KEY_ID | Your GCP Access Key | - |
| GCP_SECRET_ACCESS_KEY | Your GCP Secret | - |
| GCP_PROJECT_ID | Your GCP Project ID | - |
| MONGODB_URI | Target MongoDB URI (ex. mongodb://mongodb?replicaSet=rs0). If set, the other MONGODB_* variables are ignored. | - |
| MONGODB_HOST | Target MongoDB host | "mongo" |
| MONGODB_DBNAME | DB name to be restored from the backup | - |
| MONGODB_USERNAME | DB login username | - |
| MONGODB_PASSWORD | DB login password | - |
| MONGODB_AUTHDB | Authentication DB name | - |
| MONGORESTORE_OPTS | Options to pass to mongorestore (ex. "--drop") | - |
| AWSCLI_ENDPOINT_OPT | Custom S3 endpoint for S3-based services such as DigitalOcean Spaces (ex. "https://fra1.digitaloceanspaces.com"). If not set, the standard Amazon S3 endpoint is used. | - |
| AWSCLIOPT | Other options to pass to the aws command | - |
| GCSCLIOPT | Other options to pass to the gsutil command | - |