hazzuk/compose-backupdate

backupdate

Bash script for creating scheduled backups, and performing (backed-up) guided updates on Docker compose stacks.


Why?

Because I needed a tool that...

  • Is simple by design
  • Doesn't require changes inside my compose.yaml files
  • Works with both bind mounts and named volumes
  • Can be used to create 🕑scheduled backups
  • Can also create ad-hoc backups alongside guided container ⬆️updates
  • Isn't trying to replace existing cloud backup tools (like rclone)

Core Functionality

The core focus of backupdate is creating archived backups of your Docker compose stacks.

How it works

  1. 🛑Stop any running containers in the Docker compose stack
  2. 📁Create a .tar.gz backup of the stack's working directory
  3. 📁Create .tar.gz backups of any associated named volumes
  4. ⬇️Ask to pull any new container images (-u)
  5. 🔁Restart previously running Docker compose containers
  6. 🗑️Ask to prune any unused container images (-u)

Read the official Docker documentation for more details on "Back up, restore, or migrate data volumes".
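The steps above map onto standard Docker and tar commands. Here is a minimal sketch of the core backup flow, assuming a stack with one named volume; the function and names are illustrative, not the script's actual implementation:

```shell
# Illustrative sketch of the backup flow; stack/volume names are
# examples, not the script's actual commands.
backup_stack() {
    local stack_dir="$1" backup_dir="$2" stack_name="$3"
    # 1. stop any running containers in the stack
    ( cd "$stack_dir" && docker compose stop )
    # 2. archive the stack's working directory
    tar czf "$backup_dir/$stack_name.tar.gz" -C "$stack_dir" .
    # 3. archive a named volume via a throwaway container
    docker run --rm \
        -v "${stack_name}_data":/data \
        -v "$backup_dir":/backup \
        alpine tar czf "/backup/${stack_name}_data.tar.gz" -C /data .
    # 5. restart the previously running containers
    ( cd "$stack_dir" && docker compose start )
}
```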


Setup

Install

Warning

This script is provided as-is, without any warranty. Use it at your own risk.

Important

The install command and the script must be run with elevated permissions.

bash -c 'curl -fsSL -o /bin/backupdate https://raw.githubusercontent.com/hazzuk/compose-backupdate/refs/heads/release/backupdate.sh && chmod +x /bin/backupdate'

Expected compose directory structure

The script expects your docker compose working directory to be located at $docker_dir/$stack_name:

$docker_dir = "/path/to/your/docker"
$stack_name = "nextcloud"

docker/
├─ nginx/
│  └─ compose.yaml
├─ wordpress/
│  └─ compose.yaml
└─ nextcloud/
   └─ compose.yaml

Options

Command line

Required

  • -b "", --backup-dir "": Backup directory
  • -d "", --docker-dir "": Docker compose directory parent
  • -s "", --stack-name "": Docker compose stack name

Optional

  • -l "", --backup-blocklist "": Volumes/paths to ignore
  • -u, --update: Update the stack containers
  • -v, --version: Check the script version for updates

Environment variables

# backup directory
export BACKUP_DIR="/path/to/your/backup"
# docker compose directory parent
export DOCKER_DIR="/path/to/your/docker"
# docker compose stack name
export STACK_NAME="nginx"
# volumes/paths to ignore
export BACKUP_BLOCKLIST="plex_media,/plex-cache"

Example Usage

📀Backups

backupdate -s "nginx" -d "/path/to/your/docker" -b "/path/to/your/backup"
backupdate --stack-name "nginx" \
    --docker-dir "/very/long/path/to/docker" \
    --backup-dir "/very/long/path/to/the/backup"

Tip

backupdate automatically searches for a compose.yaml / docker-compose.yaml file inside your current directory. Running backupdate inside your Docker compose working directory won't require --docker-dir or --stack-name:

cd /path/to/your/docker/nginx

backupdate -u -b "/path/to/your/backup"

⬆️Updates (manual only)

Note

Stack updates (unlike backups) can only be performed manually. This is by design.

backupdate -u -s "nginx" -d "/path/to/your/docker" -b "/path/to/your/backup"

🕑Scheduled backups

You can create a cron job or use another tool like Cronicle to run something similar to this example script, which will periodically back up your Docker compose stacks automatically:

#!/bin/bash

# set environment variables
export DOCKER_DIR="/path/to/your/docker"
export BACKUP_DIR="/path/to/your/backup"

# set stack names
stack_names=(
    "nginx"
    "portainer"
    "ghost"
    "home-assistant"
)

# create backups
for stack in "${stack_names[@]}"; do
    backupdate -s "$stack"
done

# upload backups to cloud storage
rclone sync "$BACKUP_DIR" dropbox:backup
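To schedule it with cron, assuming the script above is saved as /usr/local/bin/backup-stacks.sh (the path is an example), a crontab entry running it nightly might look like:

```
# run the backup script every day at 03:00 (path is illustrative)
0 3 * * * /usr/local/bin/backup-stacks.sh >> /var/log/backupdate.log 2>&1
```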

🚫Backup blocklist

By default, backupdate will back up all related named volumes and the stack's full working directory. You can use -l or --backup-blocklist if you want to explicitly exclude certain volumes or paths from the backup.

# ignore the plex_media volume and the /plex-cache directory

backupdate -s "plex" \
    -d "/path/to/your/docker" \
    -b "/path/to/your/backup" \
    -l "plex_media,/plex-cache"
# you'll likely want to set the backup blocklist as an environment variable
# when you need to ignore volumes/paths for multiple stacks

export BACKUP_BLOCKLIST="\
plex_media,\
/plex-cache,\
/nginx.conf,\
nginx_logs,\
/data/ghost.yml"

Tip

To avoid being recognised as a volume, paths must start with a forward slash /. Note that paths are interpreted as glob(3)-style wildcard patterns.
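Since bash's `case` statement uses the same glob(3)-style matching, you can preview whether a blocklist pattern would match a given path. The helper below is purely illustrative and not part of backupdate:

```shell
# Check a glob(3)-style pattern against a path using bash `case` matching.
# This helper is illustrative only; it is not part of backupdate.
matches_pattern() {
    local pattern="$1" path="$2"
    case "$path" in
        $pattern) return 0 ;;   # unquoted: treated as a glob pattern
        *)        return 1 ;;
    esac
}

matches_pattern "/data/*.yml" "/data/ghost.yml" && echo "excluded"
matches_pattern "/data/*.yml" "/etc/ghost.yml" || echo "kept"
```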


FAQ

How do I restore the backup?

Restoring backups isn't currently automated by the script. But as backupdate just executes standard Docker commands, you can follow the official Docker guide on "Restoring volumes from backups".
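Following that guide, restoring a named-volume archive generally means un-tarring it into the volume from a throwaway container. A minimal sketch, with illustrative volume and archive names (this is not a feature of backupdate itself):

```shell
# Illustrative restore of a named-volume backup; volume and archive
# names are examples, and this is not a feature of backupdate itself.
restore_volume() {
    local volume="$1" archive="$2"
    docker run --rm \
        -v "$volume":/data \
        -v "$(dirname "$archive")":/backup \
        alpine tar xzf "/backup/$(basename "$archive")" -C /data
}

# usage (example names):
# restore_volume nextcloud_data /path/to/your/backup/nextcloud_data.tar.gz
```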

Do I need to stop containers before running backupdate?

No, backupdate does that for you. All containers are stopped automatically before a backup to ensure the saved data is consistent.

Are there any differences between backups and updates?

Yes, backupdate is focused on backups. However, it can be used to simultaneously back up and update containers in a Docker compose stack. Running an update will remove (docker compose down) all containers, perform a standard backup, then request that the containers be recreated (docker compose up -d). This differs from a plain backup, where the containers are only stopped, backed up, and then restarted; updating a Docker container requires recreating it so that it uses the newly pulled image.

How does it backup DB volumes?

The script treats all containers and their volumes the same. Shut them down, back them up, then restart them.

There are alternative tools that specifically handle database backups with Docker. These are probably most useful if you can't tolerate any downtime, or if you want to create many more backups each day and pair them with another tool with deduplication abilities to save on storage space.

Alternative tools