Introduction

I had been using SpiderOak for backups of my laptop since 2017. While the service has gotten better since I first tried it, it still has some issues that kept me looking for alternatives.

I was using SpiderOak because they are one of the only all-in-one solutions that allow local encryption and let you specify which directories to back up. This is a must for me because I don’t want my entire home directory backed up; there are a lot of temporary and working files that I want excluded. Most systems don’t give you a choice and will only back up your home directory.

Problems with SpiderOak

Overall, SpiderOak wasn’t bad (anymore) but there were a few annoyances that made me start looking into what else is out there.

Backups not working

The biggest problem with SpiderOak is their desktop app, which would periodically stop working and stop backing up. This coincided with new releases of the app. I’m not sure if they were making incompatible changes server side or if it was something else. Either way, updating the app was required.

At the time the SpiderOak One app (for personal backup) was not in the App Store. They’ve since rebranded it as CrossClave, and earlier this year they put it into the App Store, which solves this problem. Previously, though, the app didn’t notify you from within the app when a new version was available. So the only way I’d know I needed to install a new version was to wait for backups to stop working.

On top of that, there was no notification of backups failing. So I’d have to constantly check whether backups were working and whether there was a new app version that needed to be installed.

Price

The other issue with SpiderOak is their pricing: $11 per month for 400 GB of storage when I was using it. This is very expensive compared to competing backup providers. Even using AWS S3 directly is cheaper than SpiderOak for 400 GB of storage, and S3 is itself more expensive than some dedicated backup services.

New Solution

I used a few requirements to evaluate other options.

  1. Runs on macOS
  2. Locally encrypt files before uploading
  3. Cheaper than SpiderOak
  4. Support file snapshots with each backup run
  5. Ability to restore individual files
  6. Allows choosing which directories to back up
  7. Deduplication
  8. Run on demand, not as a background process

A nice-to-have, but not a requirement, is the ability to exclude files from the backup. My blog creates a number of temporary files, and I’d rather have the backup skip the build and cache directories than have to remember to delete them before running a backup.

My first thought was to use duplicity, which I had previously scripted for backing up my server. It’s fine but a bit cumbersome. I also looked at a few commercial solutions, and while some might have worked, I wasn’t happy with them for one reason or another.

Ultimately I found and settled on restic. I’ve been using it for several months and it has been working very well. It’s been the best experience I’ve had with any backup system. It’s fast, intuitive, easy to use, and I haven’t run into any issues.

For storage I decided to use Backblaze’s B2 cloud storage. It’s cheaper than AWS S3 and is supported natively by restic. So far I’ve had no problems with B2 and it’s been a solid offering. The price has been great too. With about 250 GB of data I’m paying less than $1.50 a month. Far cheaper than any backup service I’ve seen.
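As a rough sanity check on that number, B2 storage was priced at about $0.005 per GB per month when I set this up (that rate is an assumption here; check Backblaze’s current pricing), so the math works out like this:

```shell
# Rough B2 storage cost estimate, assuming the ~$0.005/GB-month rate
# in effect at the time (verify against current Backblaze pricing).
awk 'BEGIN {
  gb   = 250      # data stored, in GB
  rate = 0.005    # dollars per GB per month (assumed rate)
  printf "Estimated monthly cost: $%.2f\n", gb * rate
}'
```

That lands at about $1.25 a month before any download or transaction fees, which matches what I actually see on my bill.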

Scripting

Of course I scripted the backup process to make it exceedingly easy to run. The script runs the backup and then removes old snapshots, since I only want to keep a small number of them. The files don’t change very often, with the majority of changes being new blog posts, so I only need to back up once a month.

backup.zsh

#!/bin/zsh

# Backup location and access info
LOCALLOCATION="$HOME/<BACKED UP DIR>"
REMOTELOCATION="<B2 BUCKET AND DIR>"
B2KEYID="<API KEY ID>"
B2APPKEY="<API KEY SECRET>"

# How many snapshots to keep
KEEP_LAST=6
EXCLUDEFILE="backup_excludes.txt"


# Prompt for the repo password
read -s "PASS?Repository password: "
echo ""

# Export data restic will read from the env
export B2_ACCOUNT_ID=$B2KEYID
export B2_ACCOUNT_KEY=$B2APPKEY
export RESTIC_PASSWORD=$PASS

# Create a repo. Only needs to be done once.
#restic init -r "$REMOTELOCATION" -v --repository-version latest

# Backup
echo "Starting Backup..."
restic -r "$REMOTELOCATION" -v backup "$LOCALLOCATION" --exclude-file="$EXCLUDEFILE"
echo "Finished Backup..."

# Keep only the last X snapshots. Prune gets rid of any old files not referenced by any snapshot.
echo "Starting removal of old snapshots ..."
restic -r "$REMOTELOCATION" -v forget --keep-last "$KEEP_LAST" --prune
echo "Finished removal of old snapshots ..."

# Clear the exported credentials from the environment
unset B2_ACCOUNT_ID
unset B2_ACCOUNT_KEY
unset RESTIC_PASSWORD
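For completeness, here is a sketch of what restores and integrity checks look like with the same setup. It assumes the same variables are exported as in backup.zsh, and the `--include` path is a placeholder for whatever file or directory you want back:

```shell
# List available snapshots in the repository (assumes the B2_* and
# RESTIC_PASSWORD variables are exported as in backup.zsh).
restic -r "$REMOTELOCATION" snapshots

# Restore a single file or directory from the latest snapshot into a
# scratch location. The --include path is a placeholder.
restic -r "$REMOTELOCATION" restore latest \
    --target "$HOME/restored" \
    --include "<PATH WITHIN BACKUP>"

# Verify the repository's integrity.
restic -r "$REMOTELOCATION" check
```

This covers requirement 5 from my list: individual files can be pulled out of any snapshot without restoring everything.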

This is the excludes file I’m using to skip backing up cache files.

backup_excludes.txt

.DS_Store
Blog/.jekyll-cache/*
Blog/.sass-cache/*
Blog/_site/*
Blog/_search_index/*

Conclusion

Unlike previous systems I’ve used, which have always had something I didn’t quite like, I’m very happy with restic and B2. I’ve been using them for about 10 months, and the combination does everything I want and does it better than anything else I’ve tried. The price of B2 storage also makes it a very compelling solution.