
Automating Docker Volume Backups

Backing up production databases regularly is essential. I self-host Leanote, an open-source note-taking server, and needed automated daily backups for it.

Docker Background

Docker stores volumes under the /var/lib/docker/volumes directory. Volumes created via Docker Compose follow the naming convention <directory_name>_<volume_name>, where <directory_name> is the name of the directory containing the docker-compose file and <volume_name> is the volume name specified in that file. On Linux-based systems, each volume directory is directly accessible from a root account, which makes for a simple backup process.
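As an illustrative sketch (service and volume names are my own, not taken from the Leanote project), a compose file in a directory named leanote that declares a volume called data yields the volume directory /var/lib/docker/volumes/leanote_data:

```yaml
# ~/leanote/docker-compose.yml (illustrative)
services:
  mongo:
    image: mongo:4
    volumes:
      - data:/data/db   # stored on the host as /var/lib/docker/volumes/leanote_data

volumes:
  data:
```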

Automating Docker Volume Backups (Basic Backup to Git)


To back up a volume, we can simply compress the volume directory using tar and then commit the archive file to version control.

To set up automated backups of your important data:

  1. Copy and paste the script below into a new backup.sh file in your ~/backups directory.
  2. Run git init inside your backups directory (and add a remote so backups are also stored off-site).
  3. Run crontab -e and append the following line: 0 1 * * * /bin/bash /home/pi/backups/backup.sh. This runs our backup script daily at 1am.
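The three steps above can be sketched as a one-time setup. The remote URL is a placeholder, and the crontab line is only registered if crontab is available:

```shell
# One-time setup for the backup repository (adjust paths to taste).
mkdir -p "$HOME/backups"
cd "$HOME/backups"
git init
# git remote add origin git@example.com:you/backups.git   # placeholder remote

# Register the nightly job (1am daily) without clobbering existing entries.
LINE='0 1 * * * /bin/bash /home/pi/backups/backup.sh'
if command -v crontab >/dev/null 2>&1; then
  ( crontab -l 2>/dev/null; echo "$LINE" ) | crontab - || true
fi
```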



Backup Script

#!/bin/bash
#Purpose: Backup docker container/s
#Version 1.0

# BASIC CONFIG START
items=(mongo)                           #space separated list of words. Used in file names only.
vol_names=(leanote_data)                #space separated list of volume names. Same order as items array.

DESDIR=/home/pi/backups                 # backup directory

# BASIC CONFIG END

# CUSTOMIZE THESE
TIME=$(date +%m-%d-%y-%H-%M-%S)
FILENAME=$TIME-BACKUP.tar.gz
SRCROOT=/var/lib/docker/volumes
# CUSTOMIZE END

cd "$DESDIR" || exit 1
for i in "${!items[@]}"; do
  echo "[$i]: Backing up ${items[$i]} (Volume: ${vol_names[$i]}) -------------------------- "
  ITEM=${items[$i]}
  SRCDIR=$SRCROOT/${vol_names[$i]}
  DIR=$DESDIR/$ITEM/$ITEM-$FILENAME
  echo "     Source:      $SRCDIR"
  echo "     Destination: $DIR"
  mkdir -p "$DESDIR/$ITEM"              # ensure the per-item subdirectory exists
  sudo tar -cpzf "$DIR" "$SRCDIR"
  echo "Content Listing (and integrity test):"
  tar -tzf "$DIR"
  git add "$DIR"
  git commit -m "$ITEM backup $TIME"

done

# Push all commits at the end
git push

This script compresses each volume into an archive under a per-item subdirectory of backups and commits that file to version control. To back up multiple volumes with the same script, add more elements to the items and vol_names arrays, keeping their order aligned.

Congratulations! You can now rest assured that your data is backed up automatically. To confirm backups work, check your git repository or your local mail server. Cron mails anything the script writes to STDOUT to the user that owns the crontab (here pi@raspberrypi). If your cron logs show mail delivery errors, you need to install a mail transfer agent such as postfix.

Read cron's emails with the mutt command (install it if unavailable). Mutt is a simple way to check the script's output and confirm everything is working as expected.
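If you prefer the command line over mail, a small helper can verify that the newest archive for an item lists cleanly. check_latest is my own invention, not part of the backup script:

```shell
# check_latest: verify the newest .tar.gz in a directory is a readable archive.
check_latest() {
  local dir=$1 latest
  latest=$(ls -t "$dir"/*.tar.gz 2>/dev/null | head -n 1)
  if [ -z "$latest" ]; then
    echo "no backups found in $dir"
    return 1
  fi
  echo "checking $latest"
  tar -tzf "$latest" > /dev/null && echo "archive OK"
}

# e.g. check_latest /home/pi/backups/mongo
```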

Do not stop here! Try this script in a non-production environment and restore a backup of some test data (see next section).

Docker Volume Backups to External Hard Drive and AWS

The following modified script backs up to an external hard drive and syncs the archives to AWS S3 instead. You can set up a lifecycle rule to automatically delete backups older than 30 days; some form of lifecycle policy is needed to avoid exceeding the free usage limits.
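For reference, a lifecycle configuration of roughly this shape expires objects after 30 days (the rule ID is arbitrary; it can be applied with aws s3api put-bucket-lifecycle-configuration):

```json
{
  "Rules": [
    {
      "ID": "expire-old-backups",
      "Filter": { "Prefix": "" },
      "Status": "Enabled",
      "Expiration": { "Days": 30 }
    }
  ]
}
```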

The script stops all containers using the specified volume before taking a backup. Run it periodically with cron during the night.

#!/bin/bash
#Purpose: Backup docker container/s
#Version 1.1
#START

# BASIC CONFIG START
items=(gitlab_data gitlab_db prometheus_data grafana_data )                           #space separated list of words. Item is descriptive, used in file names only.
vol_names=(gitlab_data gitlab_db prometheus_prometheus_data prometheus_grafana_data)                #space separated list of volume names. Same order as items array.

DESDIR=/mnt/IMATION/backups/ubuntu                 # backup directory

# BASIC CONFIG END

# CUSTOMIZE THESE
TIME=$(date +%Y-%m-%d-%H-%M-%S)
FILENAME=$TIME-BACKUP.tar.gz
SRCROOT=/var/lib/docker/volumes

pushd $DESDIR

for i in "${!items[@]}"; do
  echo "[$i]: Backing up ${items[$i]} (Volume: ${vol_names[$i]}) -------------------------- "
  ITEM=${items[$i]}
  SRCDIR=$SRCROOT/${vol_names[$i]}
  DIR=$DESDIR/$ITEM/$ITEM-$FILENAME
  echo "     Source:      $SRCDIR"
  echo "     Destination: $DIR"
  CONT=$(docker ps -a --filter volume=${vol_names[$i]} --format "{{.ID}}")
  echo "Stopping container ${CONT} using ${vol_names[$i]}"
  docker stop ${CONT}
  docker run --rm -v ${vol_names[$i]}:/volume -v $DESDIR/$ITEM:/backup loomchild/volume-backup backup $ITEM-$FILENAME
  echo "Starting container ${CONT} using ${vol_names[$i]}"
  docker start ${CONT}
done

echo "The following files will be removed (older than 7 days)"
find . -type f -name '*.bz2' -mtime +7 -print
find . -type f -name '*.bz2' -mtime +7 -delete

popd

#/home/daniel/.local/bin/aws s3 sync $DESDIR s3://bucket-name --delete

MY MISSION

This blog started nearly 10 years ago to help me document my technical adventures in home automation and various side projects. Since then, my audience has grown significantly thanks to readers like you.

While blog content can be incredibly valuable to visitors, it’s difficult for bloggers to capture any of that value – and we still have to work for a living too. There are many ways to support my efforts should you choose to do so:

Consider joining my newsletter or shouting a coffee to help with research, drafting, crafting and publishing of new content or the costs of web hosting.

It would mean the world if gave my Android App a go or left a 5-star review on Google Play. You may also participate in feature voting to shape the apps future.

Alternatively, leave the gift of feedback, visit my Etsy Store or share a post you liked with someone who may be interested. All helps spread the word.

BTC network: 32jWFfkMQQ6o4dJMpiWVdZzSwjRsSUMCk6

Restoring a Volume Backup

Before you relax and let the backup script do its work, convince yourself that the resulting archive not only contains the correct files, but also that Docker picks them up correctly once they are extracted and moved back into the /var/lib/docker/volumes directory.

Run the backup script, then follow the steps below. Extract the archive with the sudo tar -zxvf <archive> command. This reproduces the directory structure the files were originally located in; in our case, var/lib/docker/volumes/<volume_name>. To restore the volume, move the <volume_name> directory back into /var/lib/docker/volumes.

# cd into the extracted directory tree
cd var/lib/docker/volumes
mv <volume_name> /var/lib/docker/volumes
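Before touching real data, you can convince yourself of the tar round trip with a scratch directory standing in for /var/lib/docker/volumes (all names below are throwaway):

```shell
# Simulate backup and restore of a volume directory using a scratch tree.
SCRATCH=$(mktemp -d)
RESTORE=$(mktemp -d)
mkdir -p "$SCRATCH/volumes/leanote_data/_data"
echo "hello" > "$SCRATCH/volumes/leanote_data/_data/note.txt"

# Backup: archive the volume directory (relative paths via -C).
tar -C "$SCRATCH" -cpzf "$SCRATCH/backup.tar.gz" volumes/leanote_data

# Restore: extract elsewhere and compare against the original.
tar -C "$RESTORE" -zxf "$SCRATCH/backup.tar.gz"
diff -r "$SCRATCH/volumes" "$RESTORE/volumes" && echo "round trip OK"
```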

