heyman / postgresql-backup

Licence: other

Docker PostgreSQL Backup

Docker image that periodically dumps a Postgres database, and uploads it to an Amazon S3 bucket.

Required environment variables

  • CRON_SCHEDULE: The time schedule part of a crontab file (e.g. 15 3 * * * for every night at 03:15)
  • DB_HOST: Postgres hostname
  • DB_PASS: Postgres password
  • DB_USER: Postgres username
  • DB_NAME: Name of database
  • S3_PATH: Amazon S3 path in the format: s3://bucket-name/some/path
  • AWS_ACCESS_KEY_ID
  • AWS_SECRET_ACCESS_KEY
  • AWS_DEFAULT_REGION
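A minimal scheduled-backup run might look like the following sketch. The image name heyman/postgresql-backup and all values are illustrative placeholders; substitute your own registry path, hostnames, and credentials.

```shell
# Start the container; the cron schedule triggers a dump every night at 03:15.
# Image name and all values below are placeholders -- adjust for your setup.
docker run -d \
  -e CRON_SCHEDULE='15 3 * * *' \
  -e DB_HOST=db.example.com \
  -e DB_USER=postgres \
  -e DB_PASS=secret \
  -e DB_NAME=mydb \
  -e S3_PATH=s3://my-bucket/backups \
  -e AWS_ACCESS_KEY_ID=AKIA... \
  -e AWS_SECRET_ACCESS_KEY=... \
  -e AWS_DEFAULT_REGION=eu-west-1 \
  heyman/postgresql-backup
```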

Optional environment variables

  • S3_STORAGE_CLASS: Specify storage class for the uploaded object, defaults to STANDARD_IA.
  • S3_EXTRA_OPTIONS: Specify additional options for S3, e.g. --endpoint= for using a custom S3 provider.
  • DB_USE_ENV: Pass the Postgres connection settings via the standard PG* environment variables instead. When set, DB_HOST, DB_PASS, DB_USER and DB_NAME are ignored. Useful for advanced connection setups, e.g. mTLS. Example --env-file=.env for the container:
        DB_USE_ENV=True
        PGSSLMODE=verify-full
        PGSSLROOTCERT=/path/ca.crt
        PGSSLCERT=<path>/user.crt
        PGSSLKEY=<path>/user.key
        PGHOSTADDR=1.2.3.4
        PGHOST=db.domain.com
        PGUSER=myuser
        PGDATABASE=mydb
    
  • MAIL_TO: If both MAIL_TO and MAIL_FROM are specified, an e-mail is sent via Amazon SES every time a backup is taken
  • MAIL_FROM
  • WEBHOOK: If specified, an HTTP request will be sent to this URL
  • WEBHOOK_METHOD: By default the webhook's HTTP method is GET, but can be changed using this variable
  • WEBHOOK_CURL_OPTIONS: Pass additional headers or other options to the curl command that calls the webhook, e.g. -H 'Content-type: application/json'
  • WEBHOOK_DATA: Add a body to the webhook call; setting this implies the POST method unless WEBHOOK_METHOD says otherwise. E.g. {"text":"Backup completed at %(date)s %(time)s!"}
  • KEEP_BACKUP_DAYS: The number of days to keep backups for when pruning old backups. Defaults to 7.
  • FILENAME: String that is passed into strftime() and used as the backup dump's filename. Defaults to $DB_NAME_%Y-%m-%d.

Interpolation

Text in WEBHOOK_DATA is interpolated with variables using the %(my_var)s syntax:

  • date: Date in yyyy-mm-dd format
  • time: Time in hh:mm:ss format
  • duration: Number of seconds the backup took
  • filename: Name of the file uploaded to S3
  • size: Size of the backup file with suitable suffix, like MB, GB, ...

Example on how to post a Slack message when a backup is complete

  1. Configure a webhook as described in the Slack documentation.
  2. Set WEBHOOK and the related WEBHOOK_* variables accordingly:
    WEBHOOK=https://hooks.slack.com/services/.../.../...
    WEBHOOK_METHOD=POST
    WEBHOOK_CURL_OPTIONS=-H 'Content-type: application/json'
    WEBHOOK_DATA={"text":":white_check_mark: Backup completed at %(date)s %(time)s\nDuration: %(duration)s seconds\nUpload: %(filename)s: %(size)s"}
    

Volumes

  • /data/backups - The database is dumped into this directory

Restoring a backup

This image can also be run as a one-off task to restore one of the backups. To do this, run the container with the command python -u /backup/restore.py [S3-filename], where S3-filename is only the name of the file; the directory is set through the S3_PATH environment variable.
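For example, a restore invocation might look like the sketch below. The image name, credentials, and backup filename are illustrative assumptions; the last argument is the bare object name under S3_PATH.

```shell
# One-off restore of a previously uploaded dump.
# All values are placeholders; the last argument is the file's name in S3,
# without any s3://... prefix (that part comes from S3_PATH).
docker run --rm \
  -e DB_HOST=db.example.com \
  -e DB_USER=postgres \
  -e DB_PASS=secret \
  -e DB_NAME=mydb \
  -e S3_PATH=s3://my-bucket/backups \
  -e AWS_ACCESS_KEY_ID=AKIA... \
  -e AWS_SECRET_ACCESS_KEY=... \
  -e AWS_DEFAULT_REGION=eu-west-1 \
  heyman/postgresql-backup \
  python -u /backup/restore.py mydb_2021-01-15
```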

The following environment variables are required:

  • DB_HOST: Postgres hostname
  • DB_PASS: Postgres password
  • DB_USER: Postgres username
  • DB_NAME: Name of database to import into
  • S3_PATH: Amazon S3 directory path in the format: s3://bucket-name/some/path
  • AWS_ACCESS_KEY_ID
  • AWS_SECRET_ACCESS_KEY
  • AWS_DEFAULT_REGION

Other optional environment variables

  • S3_EXTRA_OPTIONS: Specify additional options for S3, e.g. --endpoint= for using a custom S3 provider.
  • DB_USE_ENV: See Optional environment variables above.

Taking a one off backup

To run a one-off backup job, e.g. to test that everything works when setting it up for the first time, start the container with the docker run command set to python -u /backup/backup.py and all the required environment variables set.
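Such a test run might look like this sketch; the image name and all values are placeholders:

```shell
# Run a single backup immediately instead of waiting for the cron schedule.
# Image name and values are placeholders -- adjust for your setup.
docker run --rm \
  -e CRON_SCHEDULE='15 3 * * *' \
  -e DB_HOST=db.example.com \
  -e DB_USER=postgres \
  -e DB_PASS=secret \
  -e DB_NAME=mydb \
  -e S3_PATH=s3://my-bucket/backups \
  -e AWS_ACCESS_KEY_ID=AKIA... \
  -e AWS_SECRET_ACCESS_KEY=... \
  -e AWS_DEFAULT_REGION=eu-west-1 \
  heyman/postgresql-backup \
  python -u /backup/backup.py
```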

Docker tags

This image uses the Alpine variants of the official postgres image as its base.

The following Docker tags are available for this image, each based on the corresponding official postgres Alpine image:

  • 13, latest
  • 12
  • 11
  • 10