
S3 Postgres Backup


An Alpine-based Docker image to automatically perform periodic dumps of a Postgres server to an S3 bucket. Supports encryption, compression, and restoration. Code is scanned for secret leaks by GitLeaks and for package vulnerabilities by Anchore Grype. The CI pipeline includes code quality checks with ShellCheck and internal E2E automated tests (ATs).

Based on the project postgresql-backup-s3 by itbm.

Main features

  • Supports setting a custom interval for backup generation;
  • Supports encryption (AES-256-CBC) using a password provided via an environment variable;
  • Supports dump file compression (before encryption) using xz with a customizable level;
  • Allows deleting previous backup files older than a customizable threshold;
  • Supports backup restoration using the CLI;
  • Can dump a single database or every database in the Postgres server.

Summary

  • How to Run?
  • Configuring
  • Testing
  • Contributing
  • Contributors
  • License

How to Run?

Requirements

To run this service, make sure the following requirements are met:

  1. There is a Postgres instance up and running, from which the data will be exported;

  2. There is an S3 bucket and an IAM user (identified by a key ID and secret key) to which the backup files will be uploaded (a minimal IAM policy sketch follows this list);

  3. Docker is installed and running on the host machine.
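
Regarding the IAM user in the second requirement, a minimal policy sketch is shown below. The bucket name my-backup-bucket is a placeholder, and s3:DeleteObject is only needed when DELETE_OLDER_THAN (see Configuring) is used:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:ListBucket",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::my-backup-bucket",
        "arn:aws:s3:::my-backup-bucket/*"
      ]
    }
  ]
}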

Preparing the environment

First of all, copy the .env.template file to .env in the project root folder:

cp .env.template .env

Then edit the file accordingly.
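
As an illustration, a minimal .env could look like the sketch below. Every value is a placeholder (following the same placeholder style used in the Configuring section, where each variable is described):

POSTGRES_HOST=postgres
POSTGRES_USER=<user>
POSTGRES_PASSWORD=<password>
S3_REGION=<region>
S3_BUCKET=<bucket>
S3_ACCESS_KEY_ID=<key_id>
S3_SECRET_ACCESS_KEY=<access_key>
SCHEDULE=@every 6h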

Building the image

To build the image (assuming the s3-postgres-backup image name and the latest tag), use the following command in the project root folder:

docker build -f ./Dockerfile --tag s3-postgres-backup:latest .

After setting up the environment and building the image, you can now launch a container with it. Assuming the image name and tag from the previous step, use the following command in the project root folder:

docker run --rm -v "$(pwd)/scripts:/backup/scripts" --env-file ./.env --name "s3-postgres-backup" s3-postgres-backup:latest

Running in docker-compose

As this repository has a Docker image available for pulling, you can add this service to an existing stack by creating a service that uses the ferdn4ndo/s3-postgres-backup:latest image:

services:
  ...
  s3-postgres-backup:
    image: ferdn4ndo/s3-postgres-backup:latest
    container_name: s3-postgres-backup
    env_file:
      - ./.env
    depends_on:
      postgres:
        # Adjust "postgres" to your database service name. The
        # service_healthy condition makes the backup wait until
        # the database reports a healthy state.
        condition: service_healthy
  ...
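
Note that the service_healthy condition only takes effect if the database service defines a healthcheck. A minimal sketch for the postgres service, where the image tag and credentials are placeholders:

  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_PASSWORD: change-me
    healthcheck:
      # pg_isready returns success once the server accepts connections
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 10s
      timeout: 5s
      retries: 5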

Configuring

The service is configured using environment variables. They are listed and described below. Use the Summary for faster navigation.

Note that default values suffixed with ¹ are invalid placeholders and must be replaced before running the service; otherwise an error will be thrown during startup.

General Configuration

SCHEDULE

Main backup routine schedule. Uses the CRON Expression Format; the default value uses the interval (@every) syntax.

Required: YES

Default: @every 6h
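
Two illustrative values, assuming the scheduler accepts standard cron expressions in addition to @every intervals (as the linked format suggests): the first runs every 6 hours (the default), the second every day at 03:00:

SCHEDULE=@every 6h
SCHEDULE=0 3 * * *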

ENCRYPTION_PASSWORD

The encryption password is used to encrypt the backup. If the value is empty, the backup won't be encrypted.

Required: NO

Default: EMPTY
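
An encrypted backup can later be decrypted with openssl. A minimal sketch, assuming the image relies on openssl's default enc parameters; the file name is illustrative, and the exact flags (e.g. -md or -pbkdf2) must match the ones used by the image's backup script:

openssl enc -d -aes-256-cbc \
  -pass env:ENCRYPTION_PASSWORD \
  -in backup.sql.xz.enc \
  -out backup.sql.xz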

DELETE_OLDER_THAN

If a DateTime in ISO format is specified, the backup system will delete backups that are older than the specified DateTime. When empty, no previous backup will be deleted.

Required: NO

Default: EMPTY
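
For example, to delete every backup created before January 1st, 2024 (UTC), assuming the script accepts a timezone-suffixed ISO 8601 value:

DELETE_OLDER_THAN=2024-01-01T00:00:00Z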

TEMP_PATH

The path used to temporarily store the exported dump file while it is compressed, encrypted (if configured), and uploaded to S3.

Required: NO

Default: /temp

XZ_COMPRESSION_LEVEL

Dump file compression level, from 0 to 9. Compression is skipped when the value is 0 or 1.

Required: NO

Default: 6
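
A downloaded dump (after decryption, if it was encrypted) can be decompressed locally with xz; the file name is illustrative:

xz -d backup.sql.xz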

BACKUP_PREFIX

Optional prefix to be prepended to the backup filenames.

Required: NO

Default: EMPTY

RUN_AT_STARTUP

If set to 1, a backup will be performed as soon as the container startup delay finishes. Otherwise, the first backup will only happen after the main schedule interval.

Required: NO

Default: 1

STARTUP_BKP_DELAY_SECS

Delay interval (in seconds) after the container initialization to wait before entering the main backup routine.

Required: NO

Default: 5

Postgres Configuration

POSTGRES_DATABASE

Postgres database name. If empty, all databases will be exported in the dump file.

Required: NO

Default: EMPTY

POSTGRES_HOST

Postgres connection host

Required: YES

Default: <host>¹

POSTGRES_PORT

Postgres connection port

Required: NO

Default: 5432

POSTGRES_USER

Postgres connection user

Required: YES

Default: <user>¹

POSTGRES_PASSWORD

Postgres connection password

Required: YES

Default: <password>¹

POSTGRES_EXTRA_OPTS

Custom extra arguments passed to the Postgres CLI

Required: NO

Default: EMPTY
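
For example, assuming the value is forwarded verbatim to the dump command, the following would export only the public schema and skip ownership statements (both standard pg_dump options):

POSTGRES_EXTRA_OPTS=--schema=public --no-owner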

AWS S3 Configuration

S3_REGION

AWS S3 Region used to store the backup files

Required: YES

Default: <region>¹

S3_BUCKET

AWS S3 Bucket used to upload the files

Required: YES

Default: <bucket>¹

S3_ACCESS_KEY_ID

AWS S3 Access Key ID used to connect and perform the upload

Required: YES

Default: <key_id>¹

S3_SECRET_ACCESS_KEY

AWS S3 Secret Access Key used to connect and perform the upload

Required: YES

Default: <access_key>¹

S3_PREFIX

AWS S3 path prefix (subfolder) used when performing the upload. May be left empty.

Required: NO

Default: EMPTY

S3_ENDPOINT

Custom S3 endpoint URL. The default AWS endpoint will be used when empty.

Required: NO

Default: EMPTY
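
Setting this is useful for S3-compatible storage services. An illustrative value pointing at a self-hosted MinIO instance (the URL is a placeholder):

S3_ENDPOINT=https://minio.example.com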

Testing

The repository pipelines include:

  • code leak testing at .github/workflows/test_code_leaks.yaml;
  • package vulnerability testing at .github/workflows/test_grype_scan.yaml;
  • code quality testing at .github/workflows/test_code_quality.yaml;
  • UTs (which call the run_*_tests.sh scripts) and E2E functional tests at .github/workflows/test_ut_e2e.yaml.

The UTs and E2E tests are described in the sections below.

Unit Tests (UTs)

To execute the UTs, make sure that the s3-postgres-backup container is up and running.

This can be achieved by running the docker-compose.yaml file:

docker compose up --build --remove-orphans

Then, after the container is up and running, execute this command in the terminal to run the test script inside the s3-postgres-backup container:

docker exec -it s3-postgres-backup sh -c "./run_unit_tests.sh"

The script will exit successfully if all the tests have passed, or abort with an error otherwise. The output is verbose, so check it for details.

End-to-End (E2E) Tests

To execute the ATs, make sure (as with the UTs) that both the s3-postgres-backup container and a postgres instance are up and running.

This can be achieved by running the docker-compose.yaml file:

docker compose up --build --remove-orphans

Then, after both containers are up and running, run the test script inside the s3-postgres-backup container:

docker exec -it s3-postgres-backup sh -c "./run_e2e_tests.sh"

The script will exit successfully if all the tests have passed, or abort with an error otherwise. The output is verbose, so check it for details.

Contributing

If you face an issue or would like a new feature, open an issue in this repository. Please describe your request in as much detail as possible (remember to attach binary/big files externally) and wait for feedback. If you're familiar with software development, feel free to open a Pull Request with the suggested solution.

Any help is appreciated! Feel free to review, open an issue, fork, and/or open a PR. Contributions are welcome!

Contributors

ferdn4ndo

The acknowledgments are also extended to the original postgresql-backup-s3 contributors:

itbm

Baran Kaynak

Vladimir Polyakov

License

This application is distributed under the MIT license.
