Thursday, July 27, 2017

Arq 5.9 Adds Backblaze B2 and Wasabi Support

Stefan Reitshamer:

Now you can back up with Arq to Backblaze’s B2 storage! It’s a super-cheap option ($.005/GB per month) for storing your backups.

This compares with $0.0125 for Amazon S3 Infrequent Access, $0.004 for Amazon Glacier, and $0.007 for Google Coldline. Wasabi, which is new to me, charges $0.0039.
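To put the quoted per-GB rates in perspective, here's a throwaway snippet that computes the monthly storage cost for a hypothetical 500 GB backup set. The size is just an example, and this covers storage only; API requests, egress, and minimum-retention charges (which vary by provider) are ignored.

```python
# Rough monthly storage cost at the quoted per-GB rates, for a
# hypothetical 500 GB backup set (storage only; API requests, egress,
# and minimum-retention charges are ignored).
rates_per_gb = {  # USD per GB per month, as quoted above
    "Backblaze B2": 0.005,
    "Amazon S3 Infrequent Access": 0.0125,
    "Amazon Glacier": 0.004,
    "Google Coldline": 0.007,
    "Wasabi": 0.0039,
}

size_gb = 500  # example size; plug in your own backup set total

for name, rate in sorted(rates_per_gb.items(), key=lambda kv: kv[1]):
    print(f"{name:28} ${rate * size_gb:6.2f}/month")
```

At that size the spread is only a few dollars a month, which is why restore speed and retrieval fees matter at least as much as the headline rate.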

An under-appreciated Arq feature is that it also supports directly connected hard drives. This makes it a good accompaniment to clones (which lack versioning and are space-inefficient) or Time Machine (which tends to corrupt itself).

I now back up only the most important files and most recently added photos to the cloud. I found that for big restores I always wanted to use local backups, anyway. Reducing the size of the cloud backup set makes it more likely that the files will be backed up promptly. New files aren’t waiting for old ones (which already exist on local backups) to upload, and backups aren’t halted for as long when Arq does its maintenance.

Previously: B2 Cloud Storage.


Adrian Bengtson

The ability to also back up to hard drives, both directly connected and network-attached, with a single solution is one of the features I like about CrashPlan. (There are other issues with CrashPlan, so I'm not happy with everything about it.)

Me too. I just bought another year of CrashPlan, but the price has been consistently doubling even as storage costs elsewhere fall. It still combines a lot of features unmatched by any other backup software, years after the initial version, which seems really odd, but…

I've read from Michael Tsai and others that Arq has performance issues with large backup sets, but I never really understood how large "large" is. I don't back up my entire drive with CrashPlan, just important files; everything else gets local SuperDuper! clones to alternating disks. I used to have a local backup, but I just retired it, as a local NAS is good enough (and lets me make a RAID 1 of the target volume, so I don't have to clone the CrashPlan backup volume the way I used to!)

Here is what my family is currently backing up to CrashPlan, which doesn't *seem* that gigantic compared to some people I know, but…

iMac (family member's): 631,896 files, 495.4 GB
MBP (family member's): 265,737 files, 107.6 GB
Mac mini (family server): 254,380 files, 47.6 GB
Mac mini (mine): 975,186 files, 282.4 GB

Total: 2,127,199 files, 933.0 GB
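For what it's worth, the per-machine figures do add up to the stated totals; a quick throwaway check (not part of any backup tool):

```python
# Sanity-check the per-machine counts against the stated totals.
machines = {
    "iMac (family member's)":   (631896, 495.4),
    "MBP (family member's)":    (265737, 107.6),
    "Mac mini (family server)": (254380, 47.6),
    "Mac mini (mine)":          (975186, 282.4),
}

total_files = sum(files for files, _ in machines.values())
total_gb = sum(gb for _, gb in machines.values())

print(total_files, round(total_gb, 1))  # 2127199 933.0
```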

If anyone has used Arq to back up sets around this size, I'd love to know! If not I guess I'll try it myself.

@Nicholas CrashPlan also has issues with large backup sets. I had a few TB in it and had to massively increase the RAM allocation and go through multiple week-long prunings on the server to work around an issue where it simply stopped backing up. The support folks were very helpful, but I realized that with five years or so of backing up most of my main drive, and keeping lots of versions, I was really pushing the limits.

You could do four separate Arq destinations, and my guess is that would work fine. Caveat: Arq 5 changed some things and introduced problems that made validating/pruning really slow (as in days for hundred-GB backup sets, with all backups paused while it does this). I’m not sure whether that’s fixed yet. Right now my main Arq cloud backup is only 50 GB, and that validates/prunes fast enough that I don’t even notice.

@Nicholas 200 GB with Arq + S3, seems to be running fine every day.

Adrian Bengtson

Problems with large backup sets are one of the main issues I have with CrashPlan. There's some kind of low-level re-syncing that takes days to complete, and during that time no proper backup is done, as far as I understand.

> Reducing the size of the cloud backup set makes it more likely that the files will be backed up promptly.

Arq lets you _detach_ a backup (it's a small button on the right-hand side).

Let’s say I want to back up my Documents folder and my very large iTunes Music folder (or even a videos folder). My Documents folder changes frequently, but the Music folder will see only a few changes over several months. What happens every hour is that the scan of the huge Music folder delays the backup of the Documents folder. I think this is the situation you have in mind.

In such a case I could do this:

I create a separate backup for each (to the same destination) and let the backups complete. Once they have completed, I detach the Music folder backup. That means my Music folder stays in the cloud, but it doesn't receive any updates and thus no hourly scans.

Once every month or so I could re-attach the Music backup to bring it up to date.

For me, the major annoyance with Arq is that it doesn't let me back up specific subfolders without auto-adding new folders in the same parent path. An example:

I have a backup for ~/Library. But I want to back up only certain folders there, for example ./Scripts, ./Services, ./Application Support/LaunchBar, and a couple more. What I'm doing is clicking "Edit backup selections", then unchecking everything and checking only the needed folders. Fine so far. But when I install a new app, it is likely to create a new folder in Library or Application Support, and that folder gets auto-added to the backup set. But of course I don't want it backed up.

There's no way to tell Arq "back up only the checked folders and nothing more".

With CrashPlan the solution is very elegant: if I check a parent folder and then uncheck some subfolders, new folders will be auto-added (the same behaviour as with Arq). However, when I uncheck the parent folder and then re-check some subfolders, only the checked folders will be backed up, and newly arrived folders will be ignored. Voilà.
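The two selection semantics can be sketched like this; a hypothetical illustration, not Arq's or CrashPlan's actual code, and the function names are mine:

```python
# Two ways to decide whether a subfolder gets backed up.

def arq_style(folder, excluded):
    """Exclusion list: include everything under the parent except
    explicitly unchecked folders. A newly created folder is not in
    the exclusion list, so it is auto-added to the backup."""
    return folder not in excluded

def optin_style(folder, included):
    """Inclusion list: back up only explicitly checked subfolders.
    A newly created folder is not in the inclusion list, so it is
    ignored."""
    return folder in included

checked = {"Scripts", "Services", "Application Support/LaunchBar"}
unchecked = {"Caches", "Preferences"}  # example unchecked folders

# A new folder appears after an app install:
new_folder = "Application Support/SomeNewApp"

print(arq_style(new_folder, unchecked))   # True: auto-added
print(optin_style(new_folder, checked))   # False: ignored
```

The same checkbox UI can drive either rule; the difference is just whether the stored list names what's out or what's in.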

The only way in Arq to back up only selected subfolders is to create an individual backup for each. This is very clumsy to set up and (probably) also slows down the scanning/backup process.

(Some time ago I suggested the CrashPlan approach to the author of Arq, but he thinks the main purpose of Arq is to back up _everything_, so the forced auto-adding of new folders is working as designed, and a UI option to auto-exclude new folders would be dangerous.)

– Tom

@Tom Yes, that’s kind of what I’m thinking of. You can also just set a backup to only run manually instead of detaching. But neither makes the initial backup easy. In your example, I don’t want to disable the Music backup because I want it to make progress every day and eventually complete. But I don’t want to enable it and starve the Documents backup, either. The only solution seems to be manually flipping the switch multiple times per day.

Yes, I think the CrashPlan way could potentially cause people to accidentally exclude items. But it is sometimes useful to be able to do that. I think the CrashPlan UI could be clearer here. Instead of having the same checkbox show two different things, perhaps there should be another column to show whether new subfolders are excluded. And that would make accidents less likely.

[…] months ago, I realized that I could no longer rely on it for long-term history and started using local Arq backups for that. However, I’d like to find another solution (other than clones and Time Machine, […]
