In the (very) boring world of backing up your computer(s), something genuinely cool happened recently: it finally became feasible to back up the whole computer to the cloud. Feasible, at least, for people with a first-world income and a decent Internet connection. (Sorry, bandwidth-challenged US residents! (Where by sorry, I of course mean neener neener.))
Until the end of 2012, there were two problems with trying to back up all one’s bits to the cloud:
- too fucking slow, or
- too fucking expensive.
Happily, it is now 2013, and both of those problems have been ameliorated!
For several years, there have been cheap solutions that purport to be able to back up your computer to the cloud: Mozy, Backblaze, CrashPlan, etc. But in my real-world experience, none of these are actually capable of backing up more than a few piddlebytes of data. CrashPlan is said to be the best of the consumer-priced services, but the only time I ever used it in real life, it took more than a week to restore a mere 60GB of files (and it fucked up all my folder mod dates for good measure).
It’s debatable whether the software ‘worked’ in that case. Let’s be generous, and say I didn’t need my data in a hurry and didn’t care about restoring the metadata accurately, so it ‘worked’ for backing up a 60GB home dir. Still, there is no fucking way you can say it ‘works’ when confronted with a whole computer’s worth of data:
If your backup has been running for weeks, and still has 11.1 months remaining, it is not going to be very helpful when one of your hard drives dies next month.
The size of “a computer’s worth of data” depends, of course; my Mac Pro holds a lot more data than my notebook, which in turn has a lot more than my sister’s iMac. But I’ll use my creaky old Mac Pro at home as a guide:
The pain of the caveman backup systems of yesteryear is reflected in my drive/partition scheme. The whole point of the JUNK volume is to store shit that doesn’t really need to be backed up: Internet downloads, pr0n, anything easily replaced. Then I have a (poorly-named) BACKUP volume for OS X’s built-in Time Machine snapshot system.
So what really needs backing up? SSD and STUFF. Potentially 3TB and change, although STUFF is actually only using 1.12TB of data right now.
Until recently, I used the previously raved-about backup tool Arq to encrypt and back up my data to Amazon S3. Arq has a fairly efficient storage scheme, and it does a decent job of de-duplicating the data prior to transmission, so the storage consumed on S3 is substantially less than the amount of data being backed up: a month’s worth of daily backups of my 1.12TB STUFF drive above actually uses 836.5 GB of storage in Amazon’s cloud. (UPDATE: Oh oops, the actual size is 893 GB, so dedup is only saving 60GB. But still.)
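(Arq’s actual storage format is its own beast; purely as an illustration of the general idea, block-level dedup boils down to hashing fixed-size chunks and storing each unique chunk exactly once, so a second daily backup that’s mostly unchanged costs almost nothing extra. A minimal sketch, with made-up data:)

```python
import hashlib

def dedup_chunks(data: bytes, chunk_size: int = 4096) -> dict:
    """Split data into fixed-size chunks, keeping one copy per unique chunk.

    Returns a store mapping SHA-256 digest -> chunk bytes. Chunks that
    repeat (e.g. file blocks unchanged between daily backups) are stored
    only once, no matter how many times they appear.
    """
    store = {}
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        store[hashlib.sha256(chunk).hexdigest()] = chunk
    return store

# Two "daily backups" that are 75% identical: the second run only
# contributes the one chunk that actually changed.
day1 = b"A" * 4096 * 3 + b"B" * 4096
day2 = b"A" * 4096 * 3 + b"C" * 4096

store = dedup_chunks(day1)
store.update(dedup_chunks(day2))
print(len(store))  # → 3  (8 chunks written, only 3 unique ones stored)
```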
That’s nice, but still: 836.5 GB * $0.10/GB/month * 12 months == $1,004 per year to back up all this stuff. Maybe that would make sense if this computer were storing the irreplaceable manuscript for my three-hundred-million-page novel. But what is actually stored on this Mac is mostly gigabytes of multi-angle video footage of me cooking, and shit like that.
So since that was too expensive, I didn’t do that. I just dealt with having local backups and spent time futzing around with them every once in a while, getting more hard disks, configuring rsync to clone my shit to a spare machine at work (which is technically off-site, but not off-site enough to withstand a Godzilla attack on Tokyo), then having to debug why rsync would sometimes fail, etc.
Then one magical day, all that shit was solved.
Amazon came out with Glacier, their slow-by-design, long-term archival-oriented version of S3. Arq, naturally, released an update to support it.
The end result is: now I just back up all my shit to the cloud. The end.
$10.98. That is the total AWS charge for the month of March 2013, to back up that terabyte-or-thereabouts. So, roughly $132 a year not to have to think about it.
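For the arithmetic-inclined, here’s the comparison spelled out, using only the numbers above (the S3 figure assumes the 2013-era rate of $0.10/GB/month; the Glacier figure just annualizes the actual March bill, which folds in storage, requests, and whatever else AWS nickels-and-dimes you for):

```python
# S3: the de-duplicated 836.5 GB footprint at ~$0.10/GB/month (2013 rate).
s3_yearly = 836.5 * 0.10 * 12      # GB * $/GB/month * months

# Glacier: the actual March 2013 AWS bill of $10.98, annualized.
glacier_yearly = 10.98 * 12

print(round(s3_yearly, 2))       # → 1003.8
print(round(glacier_yearly, 2))  # → 131.76
```

About an 8x difference, for data you hope to never retrieve anyway.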
Well worth it to me.
That is all.