We don't all have fast internet connections. My current one is 2-3 Mbps up, and my previous one was only 0.8 Mbps up.

Despite that, I have close to 2 TB backed up with CrashPlan. Sure, it took a while to get everything backed up initially.

But the way CrashPlan works makes it easy to prioritise different files. For example, my documents and source code have the highest priority, then my photos, then my ripped FLAC files, and finally my massive virtual machine images.

So when I started using CP all my documents and source code were backed up in the first day or so, then it did my photos and then the rest. And the same priorities still apply, so if I dump a whole load of new photos and FLAC files onto the machine it will do the photos first and then the FLAC files.
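The behaviour described above is essentially a priority queue: new files enter the pending set, and uploads always drain the highest-priority tier first. A minimal sketch of that idea (purely illustrative, not CrashPlan's actual implementation; the tier names and class are hypothetical):

```python
import heapq

# Hypothetical priority tiers: lower number = backed up sooner.
PRIORITY = {"docs": 0, "photos": 1, "flac": 2, "vm": 3}

class BackupQueue:
    """Drains pending files in priority order, so newly added
    high-priority files jump ahead of lower-priority ones that
    are already waiting."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker keeps FIFO order within a tier

    def add(self, category, path):
        heapq.heappush(self._heap, (PRIORITY[category], self._counter, path))
        self._counter += 1

    def next_file(self):
        """Return the next path to upload, or None when idle."""
        if not self._heap:
            return None
        return heapq.heappop(self._heap)[2]

q = BackupQueue()
q.add("flac", "album1.flac")
q.add("photos", "holiday.jpg")
q.add("docs", "report.odt")
print(q.next_file())  # the document uploads first, even though it was queued last
```

This is why dumping a batch of photos and FLAC files onto the machine results in the photos going up first: the queue is re-ordered by tier, not by arrival time.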

CP also lets you back up between machines at the same time as backing up to their cloud storage. So my laptop is backed up to my home server while also being backed up to the CP cloud, and the server backs up all its data to the CP cloud.

I also have CP running in some of the virtual machines I use for my day-to-day development work, so any code I'm working on there gets backed up on the fly all through the day.

I also throw Dropbox into the mix, so that most of my data lives in a good half dozen places. One has to be careful with Dropbox, though; it is very good at distributing your file management mistakes... (my entire Dropbox contents are backed up to two places by CP)

I admire people who can rigorously apply a manual backup strategy, but it would never work for me. For lazy people like me, the backup has to be completely automatic.
_________________________
Remind me to change my signature to something more interesting someday