@icefo
Forked from dublx/retry.sh
Last active September 4, 2019 11:24
Backup nested zfs volumes to Backblaze with duplicacy
#!/usr/bin/env bash

# Run the backup, then prune old revisions; returns non-zero if either step fails
function tryUpload() {
    duplicacy backup -limit-rate 820 -threads 8 -stats &&
        duplicacy prune -exclusive -keep 360:360 -keep 30:180 -keep 7:30 -keep 1:7
}

date
(
    # Wait up to 10 seconds for an exclusive lock on fd 200
    # (backed by /var/lock/backup-dup_B2.exclusivelock, see the redirection below)
    if flock -x -w 10 200; then
        # exit on error
        set -euo pipefail

        if [ -z "$(zfs list -r -t snapshot -o name -H -S name tank/backups | grep tank/backups@snap1)" ]; then
            # Snapshot the whole tree, then clone every snapshot to a
            # readonly dataset under tank/backups/.backup-dup
            zfs snapshot -r tank/backups@snap1
            zfs list -r -t snapshot -o name -H -S name tank/backups \
                | sed -r 's/^(tank\/backups)(.*)(@snap1$)/\1\2\3 tank\/backups\/.backup-dup\2/' \
                | xargs --max-args=2 zfs clone -o readonly=on
            echo "snapshot & clone created"
        else
            echo "Previous backup was interrupted (power loss or shutdown); trying to resume backup"
        fi

        cd /mnt/disk0/backups/.backup-dup

        retry=0
        maxRetries=38
        retryInterval=60
        until [ "${retry}" -ge "${maxRetries}" ]; do
            tryUpload && break
            retry=$((retry + 1))
            echo "Retrying B2 upload [${retry}/${maxRetries}] in ${retryInterval}s"
            sleep "${retryInterval}"
        done

        if [ "${retry}" -ge "${maxRetries}" ]; then
            echo "Failed after ${maxRetries} attempts!"
            date
            exit 1
        fi

        cd /mnt/disk0/backups
        zfs destroy -r tank/backups/.backup-dup
        zfs destroy -r tank/backups@snap1
        echo "snapshot & clone destroyed"
        date
        exit 0
    else
        echo "backup-dup_B2 is already running"
        exit 1
    fi
) 200>/var/lock/backup-dup_B2.exclusivelock
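The sed pipeline in the script turns each recursive snapshot name into a `<snapshot> <clone-target>` pair, which xargs then feeds to `zfs clone` two arguments at a time. A standalone sketch of that transformation (the child datasets `photos` and `docs` are hypothetical stand-ins for whatever lives under `tank/backups`):

```shell
# Simulated output of: zfs list -r -t snapshot -o name -H -S name tank/backups
printf '%s\n' \
    'tank/backups@snap1' \
    'tank/backups/photos@snap1' \
    'tank/backups/docs@snap1' |
sed -r 's/^(tank\/backups)(.*)(@snap1$)/\1\2\3 tank\/backups\/.backup-dup\2/'
# Each line becomes "<snapshot> <clone-target>":
#   tank/backups@snap1 tank/backups/.backup-dup
#   tank/backups/photos@snap1 tank/backups/.backup-dup/photos
#   tank/backups/docs@snap1 tank/backups/.backup-dup/docs
```

Note that `-S name` sorts descending, which puts `tank/backups@snap1` first (ASCII `@` > `/`), so the parent clone is created before its children.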
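The subshell with `200>lockfile` plus `flock -x -w 10 200` is the standard single-instance guard: the redirection opens (and creates) the lock file on a fixed file descriptor, and flock takes an exclusive lock on it or gives up after the timeout. The same idiom, sketched against a throwaway lock file (fd 9 and the mktemp path are arbitrary choices for the demo):

```shell
lock=$(mktemp)   # throwaway lock file for the demo
(
    # Try for up to 1 second to take an exclusive lock on fd 9
    if flock -x -w 1 9; then
        echo "got lock"        # the critical section would run here
    else
        echo "already running" # another instance holds the lock
    fi
) 9>"$lock"
rm -f "$lock"
# With no competing holder this prints: got lock
```

Because the lock lives on the file descriptor, it is released automatically when the subshell exits, even on a crash, so no explicit unlock or stale-lock cleanup is needed.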