Backup Your Data to Scaleway S3 using Bulkstash
This article comes after a question from Adrian: “… I understand Portainer’s backup only works for its own settings. I’d like to know how I can set up backups of my other existing Docker instances to a 3rd party like S3, for instance?”
We have been using Portainer for a couple of years now, and here is what we use at a more “global” level to back up our hosts’ /data
folder (with some exclusions) to S3. (We use Scaleway.com, which is very reliable and cheaper than other alternatives for our use-case.)
We use Bulkstash, which runs rclone under the hood.
So, to back up your data to Scaleway S3 using Bulkstash and rclone, read on.
This approach requires that all your “backup eligible” data be stored under /data
on your host node.
That means all containers (even Portainer’s) mount some specific folder such as /data/blabla/lalala/drumdrum.
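As an illustration, a service following this convention would declare its volumes like below (the service name, image, and paths are made up for the example):

```yaml
# file: docker-compose.yml (hypothetical example service)
services:
  myapp:
    image: "nginx:latest"
    volumes:
      # All persistent data lives under /data on the host,
      # so the global backup job below picks it up automatically.
      - /data/myapp/html:/usr/share/nginx/html:ro
```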
Time for docker-compose files 🙂
# file: docker-compose.yml
# requires access to mount point: /data (modify as needed below)
version: '3.7'
services:
  bulkstash:
    image: "openbridge/ob_bulkstash:latest"
    restart: always
    environment:
      # Update these with your own config (S3 endpoint, names, passwords, keys...)
      BACKUP_FREQUENCY: "1h"
      REMOTE_TARGET_DIR: "node_hostname--192.168.1.12"
      RCLONE_CONFIG_S3REMOTE_TYPE: "s3"
      RCLONE_CONFIG_S3REMOTE_PROVIDER: "Scaleway"
      RCLONE_CONFIG_S3REMOTE_ACCESS_KEY_ID: "MY_SECRET_ACCESS_KEY"
      RCLONE_CONFIG_S3REMOTE_SECRET_ACCESS_KEY: "MY_SUPER_SECRET_KEY"
      RCLONE_CONFIG_S3REMOTE_ENDPOINT: "s3.fr-par.scw.cloud"
      RCLONE_CONFIG_S3REMOTE_REGION: "fr-par"
      RCLONE_CONFIG_ENCDRIVE_TYPE: "crypt"
      RCLONE_CONFIG_ENCDRIVE_REMOTE: "S3REMOTE:folder-name-automated-backups"
      RCLONE_CONFIG_ENCDRIVE_FILENAME_ENCRYPTION: "standard"
      RCLONE_CONFIG_ENCDRIVE_DIRECTORY_NAME_ENCRYPTION: "false"
      # Note: rclone expects these passwords in obscured form (see `rclone obscure`)
      RCLONE_CONFIG_ENCDRIVE_PASSWORD: "VERY_LONG_ENC_PASSWORD"
      RCLONE_CONFIG_ENCDRIVE_PASSWORD2: "VERY_VERY_LONG_ENC_PASSWORD"
    volumes:
      - /etc/localtime:/etc/localtime:ro
      # You may mount your host's root (/) to the container's /host and back up other paths
      #- /:/host
      - /data:/data
    entrypoint: |
      bash -c 'bash -s <<EOF
      trap "break;exit" SIGHUP SIGINT SIGTERM
      sleep 1m
      while /bin/true; do
        echo -e "ENCDRIVE:$$REMOTE_TARGET_DIR"
        # Note: the excludes below are examples, adapt them to your use-case.
        rclone -v copy -L /data "ENCDRIVE:$$REMOTE_TARGET_DIR"/ --exclude "*/lib/docker/**" --exclude "*/mongodb/**" --exclude "*/db/**" --exclude "*/.git/**" --exclude "*/letsencrypt/{csr,keys}/**"
        sleep $$BACKUP_FREQUENCY
      done
      EOF'
    ## Uncomment and adapt below if you are running Docker in Swarm mode
    #deploy:
    #  mode: global
    #  placement:
    #    constraints: [node.platform.os == linux]
    #  labels:
    #    - "com.centurylinklabs.watchtower.enable=true"
    labels:
      # This label keeps the bulkstash container up-to-date via Watchtower (stay tuned for another article where we will share this feature)
      - "com.centurylinklabs.watchtower.enable=true"
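One detail worth flagging: rclone expects config passwords (including those passed via `RCLONE_CONFIG_*` environment variables) in its obscured form, not as plaintext. You can generate the obscured values with rclone’s built-in command:

```shell
# Obscure the crypt passwords before putting them in the compose file.
# The passwords below are placeholders, not real secrets.
rclone obscure "VERY_LONG_ENC_PASSWORD"
rclone obscure "VERY_VERY_LONG_ENC_PASSWORD"
```

Paste each command’s output into `RCLONE_CONFIG_ENCDRIVE_PASSWORD` and `RCLONE_CONFIG_ENCDRIVE_PASSWORD2` respectively.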
To use this docker-compose.yml
file, use one of Portainer’s methods (web UI editor, git repository…). More on that if needed; let us know in the comments.
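Once the stack is running, one way to sanity-check that backups are landing on the remote is to list it from inside the container (the `docker ps` filter below assumes your container name contains “bulkstash”; adjust it to your setup):

```shell
# List the top-level directories on the encrypted remote,
# reusing the rclone config already present in the container's environment.
docker exec "$(docker ps -qf name=bulkstash)" \
  sh -c 'rclone lsd "ENCDRIVE:$REMOTE_TARGET_DIR"'
```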