I've been using DigitalOcean recently for backups, and they work fantastically. The price of $1 is hard to beat, but the backup interval (once a week) is less than ideal for me. I wanted nightly backup storage, and I ran into Wasabi.

Wasabi charges $5.99 USD for 1TB of storage, with a minimum 90-day retention period (which is fine for me). It's hard to beat that flat fee for 1TB.

It's Object Storage

Wasabi is just Object Storage with an S3-compatible API. It's perfect for backing up my servers: using restic (https://restic.net/), I can specify some backup folders and an encryption passphrase, and everything is fully encrypted before it's transmitted to Wasabi.

Setting Up Restic

To set up Restic on Ubuntu 16.04.6 LTS and 18.04.x LTS, I used snap:

snap install restic --classic

Once installed, you'll want to head over to Wasabi and create a Bucket. For example, I have one bucket per server right now (I've inquired whether the 1TB pricing is per bucket or not; assuming it's per bucket, I'll condense my buckets down to just one, as I only need to back up 500GB of data).

Creating the Restic Keys file

As root, you'll want to create a .restic-keys file, which will look similar to:

export AWS_ACCESS_KEY_ID={{ KEY_ID }}
export AWS_SECRET_ACCESS_KEY={{ SECRET_KEY }}
export RESTIC_PASSWORD={{ RANDOM_LONG_ENCRYPTION_PASSPHRASE }}

We define RESTIC_PASSWORD so restic will automatically use this passphrase to encrypt the backups. Warning: write this down! If you lose it, you lose access to your backups. I store all of mine in my KeePass database.
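Since this file holds credentials and the encryption passphrase, it's worth locking it down, and restic needs the repository initialized once before the first backup. A minimal sketch, assuming the same .restic-keys file and a bucket named after the server (the {{ HOSTNAME }} placeholder stands in for your bucket name):

```shell
# Restrict the keys file so only root can read it
chmod 600 /root/.restic-keys

# Load the credentials and point restic at the Wasabi bucket
source /root/.restic-keys
export RESTIC_REPOSITORY="s3:s3.wasabisys.com/{{ HOSTNAME }}"

# One-time initialization of the encrypted repository
restic init
```

restic init only needs to run once per repository; after that, the backup script can use it directly.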

Creating the automated backup scripts

I dropped a file called "backup.sh" in each server's root directory; it backs up just the folders I care about, and a crontab automates it. Here's what mine looks like (adjust to fit your needs):

#!/bin/bash

source /root/.restic-keys
export RESTIC_REPOSITORY="s3:s3.wasabisys.com/{{ HOSTNAME }}"

echo -e "\n`date` - Starting backup ({{ HOSTNAME }})...\n"

restic backup /etc
restic backup /root --exclude .cache --exclude .local
restic backup /home/mike --exclude .cache --exclude .local
restic backup /home/jumpsvcacct --exclude .cache --exclude .local
restic backup /var/log
restic backup /opt

echo -e "\n`date` - Running forget and prune...\n"

restic forget --prune --keep-daily 7 --keep-weekly 4 --keep-monthly 12

echo -e "\n`date` - Backup finished ({{ HOSTNAME }}).\n"

In this, I back up the entire /etc directory, the /root directory, my user account, the only ssh-enabled account (jumpsvcacct) so my jump-box can retain access, the logs directory, and /opt, since I occasionally have apps installed there (e.g. Datadog, DO monitoring, etc.). On average it's 1.7GB for everything except my home and jumpsvcacct home directories.
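Backups are only useful if you can actually restore them, so it's worth doing a test restore at least once. A quick sketch, assuming the same .restic-keys file and repository as above (the snapshot ID and paths are just examples):

```shell
# Load credentials and point at the same repository the backup script uses
source /root/.restic-keys
export RESTIC_REPOSITORY="s3:s3.wasabisys.com/{{ HOSTNAME }}"

# List available snapshots to find one to restore
restic snapshots

# Restore the latest snapshot of /etc into a scratch directory
restic restore latest --path /etc --target /tmp/restore-test
```

Restoring into a scratch directory like /tmp/restore-test lets you inspect the files before touching anything live.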

Automating it!

The last step is automating this. I want the backup to run in full without wrecking the disk IO, and crontab handles this effectively. Drop this into your crontab:

0 4 * * * ionice -c2 -n7 nice -n19 bash /root/backup.sh > /var/log/backup.log 2>&1

This uses ionice and nice. It will execute /root/backup.sh and redirect the output to /var/log/backup.log so we know what happened. I haven't set up email alerts yet, as I want Prometheus to read the logs and alert me from that node... but it's a step in the right direction.
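Until alerting is in place, a quick manual check covers both "did the job run?" and "is the repository healthy?". A sketch, assuming the same keys file and repository:

```shell
# Confirm last night's run wrote to the log
tail -n 5 /var/log/backup.log

# Verify repository integrity (checks structure and metadata;
# add --read-data to verify all blobs, at the cost of downloading everything)
source /root/.restic-keys
export RESTIC_REPOSITORY="s3:s3.wasabisys.com/{{ HOSTNAME }}"
restic check
```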
