As a self-proclaimed geek, I enjoy talking about storage. I love to see what other professionals in different fields do when it comes to their data storage and backup. I thoroughly enjoy mulling over data, backup, and storage solutions.
In the past year or two, as I've taken on more responsibility in my life, I've often gone months without a second thought to updating, let alone checking, the integrity of a local data backup. At times, I've completely ignored the need for an all-important off-site copy of my data altogether.
I've always been aware of the importance of backing up frequently and the techniques involved in putting a bulletproof system in place. However, laziness took over until I found myself at a point where I knew my data wasn't secure and, should something go wrong, I'd be starting from scratch.
The system I followed for the majority of the past few years was risky, to say the least. I'd plug in an external drive as often as I could remember and let Time Machine do its thing. I also had a drive with a copy of my iTunes and Aperture libraries at my parents' house; however, that backup was lucky to get updated even once every 12 months.
The system needed a drastic change in order to prevent a data-loss catastrophe.
What I'm Storing & Backing Up
By no means am I at the higher end of the scale when it comes to data storage requirements. I've got roughly 600GB of 'live' data on my main external WD Passport drive, consisting of a 150GB Aperture library and almost 500GB of video content.
It's often been said that the only backup solution you can rely on is one that's completely automated. Having a system in place which requires no human interaction is the only way to ensure you're running a bulletproof system. I settled on my current setup six months ago.
After a thorough review and regular tinkering, I settled on backing up my data using a three-step process: locally, via Time Machine; remotely, using two 1TB external drives in rotation; and in the cloud, backing up with Arq to Amazon Glacier.
The Local Backup
Ever since the 4-drive Drobo was released, I was eager to pick up a unit for myself to store my ever-growing Aperture and movie libraries on.
I never ended up purchasing a Drobo; before long, the need for one was eliminated as large 2TB, 3TB, and even 4TB internal hard drives hit the market, removing the need for slow, cumbersome, and expensive external storage. The rate at which I was collecting data wasn't nearly as fast as the rate at which drives were growing in size.
My local backup reverts to my original habit: plugging in a Seagate FreeAgent USB-powered drive from time to time and letting Time Machine kick in automatically.
I've managed to get into the habit of plugging it in at least once a week, mainly because it sits in the top of my bag and follows me everywhere I go. If this were my sole backup, I'd be setting myself up for failure. It's the backup regime I explain below which completes the full picture.
The Offsite Backup
I've got two 1TB drives I use in rotation for my offsite backup, which take turns heading an hour and a half down the country to my parents' house.
I'm very, very lazy about keeping this rotation on a regular schedule; however, these backups are my safety net in the absolute worst-case disaster, should the local copy, local backup, and cloud backup all fail. Impossible? Almost, but never say never. An offsite backup is a very important piece of any 'bulletproof' backup strategy.
The only items I factor into my offsite backup are my iTunes Library and my Aperture/iPhone catalogues. Backing up iTunes remotely has become less important since the introduction of iTunes Match, but I'm still keeping my iTunes Library backup off-site, as collecting the drive and transferring 100GB of data is much quicker and less expensive than downloading it.
Both of these drives are backed up using SuperDuper! I take advantage of SuperDuper's Smart Update feature, meaning only files that have changed since the last backup are updated. This keeps the time involved to a bare minimum while providing me with a mirror image of the internal drive.
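SuperDuper's internals aren't public, but the idea behind Smart Update is easy to illustrate. Here's a minimal Python sketch (my own, not SuperDuper's code): walk the source tree and copy a file only when it's new, or when its size or modification time differs from the copy already on the backup drive.

```python
import os
import shutil

def smart_update(source, dest):
    """Mirror source into dest, copying only files that are new or
    changed (by size/mtime) -- the idea behind a 'smart' update."""
    copied = 0
    for root, _dirs, files in os.walk(source):
        rel = os.path.relpath(root, source)
        target_dir = os.path.join(dest, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            src = os.path.join(root, name)
            dst = os.path.join(target_dir, name)
            s = os.stat(src)
            if os.path.exists(dst):
                d = os.stat(dst)
                if d.st_size == s.st_size and int(d.st_mtime) == int(s.st_mtime):
                    continue  # unchanged since last run -- skip it
            shutil.copy2(src, dst)  # copy2 preserves the modification time
            copied += 1
    return copied
```

Run it twice in a row and the second pass copies nothing, which is exactly why an incremental mirror takes minutes rather than hours. (A real clone also deletes files removed from the source and preserves ownership and permissions; this sketch skips that.)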
The Cloud Backup
The speed of broadband in New Zealand is nothing to write home about. The time it's taking to complete the rollout of fibre gives me even less to write home about.
Presently, at best, my broadband connection runs at roughly 8Mbps down and 1Mbps up. The fibre rollout for my region is scheduled for late 2014, with theoretical promised speeds (depending on provider and plan) of up to 100Mbps.
That said, the stability and speed of DSL have improved to a point where backing up to a cloud-based service is doable, provided you've got enough patience for the initial upload to take place.
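To put 'patience' in numbers, here's a back-of-the-envelope calculation using the connection figures above (and the optimistic assumption that the uplink sustains its full rated speed with no overhead):

```python
def upload_days(gigabytes, uplink_mbps):
    """Rough days needed to push a backup through an uplink,
    assuming the connection runs flat-out at its rated speed."""
    megabits = gigabytes * 8 * 1000       # 1GB ~ 8,000 megabits (decimal units)
    return megabits / uplink_mbps / 86_400  # 86,400 seconds in a day

# ~600GB of live data over a 1Mbps uplink:
print(round(upload_days(600, 1)))  # about 56 days of non-stop uploading
```

Real-world throughput is lower still, which is why the initial seed is the painful part: once it's done, only the changes have to go up.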
The system I've put in place for backing up to the cloud is primarily built around Amazon Glacier. However, there are also a number of other services and applications which store their data remotely in the cloud for easy access, which is a huge positive for extra data redundancy.
Having previously become frustrated with the user interface of Backblaze, I read Shawn Blanc's insightful review of Haystack Software's Arq, a menu-bar-based application used for automating backups to Amazon's servers.
Amazon offers, and I quote, 'average annual durability of 99.999999999% for an archive'. Glacier is designed to withstand the concurrent loss of two data centres without any effect on your data.
Glacier storage pricing is $0.01 per GB per month, with no charge for uploading data. Pricing for S3 is $0.14/GB per month (or $0.093/GB per month for Reduced Redundancy Storage). In simpler terms, Glacier/S3 storage is dirt cheap. The most recent Amazon bill I received was for just $7.98.
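Those per-GB rates are simple arithmetic. This quick sketch (the function name is mine, and it deliberately ignores Amazon's request and retrieval fees, which is why a real bill runs a little higher) shows roughly where a figure like mine comes from:

```python
def monthly_storage_cost(gigabytes, rate_per_gb):
    """Flat per-GB-per-month storage cost. Uploads to Glacier are
    free; request and retrieval fees are ignored in this sketch."""
    return gigabytes * rate_per_gb

print(round(monthly_storage_cost(600, 0.01), 2))   # 6.0  -> ~$6/month on Glacier
print(round(monthly_storage_cost(600, 0.093), 2))  # 55.8 -> the same data on S3 RRS
```

At roughly a tenth of the S3 price, Glacier is the obvious choice for an archive you hope never to touch.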
I've set Arq to automatically kick in every hour and back up my entire home folder in OS X. Arq also backs up my Aperture library (kept on my external storage drive) each time the drive is plugged into my MacBook Pro.
This is my absolute worst-case backup location. I'd only need to spend the time, and the few dollars, to access the data if my MacBook Pro were stolen or destroyed along with my local Time Machine backup and my two external off-site drives. Theoretically, this backup would remain untouched under any data-loss circumstance, bar a natural disaster hitting two cities over 100km apart.
Dropbox is the backbone of my file system; it's where I store all of my active and archived documents, PDF files, work files, drafts, etc.
While I love the ability to pick up my files online no matter which machine I might be working on, I don't put this feature to use as frequently as I could. The number one reason I'm so in love with Dropbox is the peace of mind of knowing my files are continuously backed up and accessible via the cloud.
Dropbox is the most indestructible aspect of my day-to-day file storage. Anything stored in my Dropbox folder is stored locally on my machine, remotely on their servers, on my local Time Machine backup, and the folder is also included in my hourly Arq backup to Amazon Glacier.
iTunes Match isn't something I've necessarily incorporated into my back-up workflow on purpose, but I love the redundancy it offers. All 10,994 items in my iTunes Library are matched and uploaded to Apple's servers with iTunes Match.
While these are also stored locally on my Time Machine drive and off-site on one of the two 1TB drives I have in rotation, an absolute worst-case scenario would still allow me to re-download my entire library directly through iTunes on any machine.
By far my favourite feature of iCloud is the automatic backing up of my iPhone each time it's plugged into a power source. Additionally, a number of my frequently used applications (including Byword and iA Writer) store and sync data using Apple's iCloud service. This adds another level of redundancy to my working files.
If something were to happen to my MacBook Pro, I could simply head down to the Apple Store, pick up a new machine, and have it up and running, mirroring the file system of the stolen/lost/destroyed machine, within a couple of hours:
- Launch the Mac App Store and re-download all purchases.
- Install Dropbox and sync files.
- Log in to iTunes Match and start downloading my music library.
- For my Aperture and iPhone libraries, I'd have the option of connecting my Time Machine drive or pulling down my photo collection from Amazon Glacier, depending on whether I was at home or out travelling.
The most important factor in backing up is making sure at least one copy of your data is backed up automatically, without you having to think about it or lift a finger to make it happen.
For me, this 'fingerless and thoughtless' backup is Arq to Amazon Glacier. I don't have to think about it taking place; it simply happens in the background as I work on my MacBook Pro.
The 'Perfect' Backup
I'm constantly tinkering with various aspects of this backup system. In a perfect world, I'd configure my laptop of choice with (at least) a 1TB internal SSD. (We're only 256GB short of this build-to-order option; the current MacBook Pro can be configured with an SSD of up to 768GB.)
This would allow me to store my iTunes and Aperture libraries locally on the SSD. I'd purge and delete as much of my video content as I could, keeping only the most important content local.
My local backup would be via Time Machine to the new Time Capsule, and I'd continue to mirror the internal SSD to Amazon Glacier via Arq. I'd continue to rotate two drives off-site with SuperDuper!
I'd do away with the messing around with external drives, which means I'd be able to fully utilise the potential (and the whole point) of owning a laptop.
This backup regime will constantly change as new services, applications, frameworks, and devices are introduced. The next planned change is outlined above: eliminating as much local data as possible and turning my local backup into something that requires no attention or thought.