Still happy enough here.

Like Andy, I found many other solutions didn't seem to handle large numbers of files or large amounts of data. That was true even at the enterprise level a few years back, when I was looking for a solution for my old company.

Took about 6 months to get about 800GB uploaded to their backend. The deduplication works quite well too: currently I'm at 959GB locally and the server reports 672GB stored, so that's roughly 30% savings from compression/deduplication.
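For what it's worth, the 30% is just the relative difference between the two figures; a trivial sketch with my numbers (nothing CrashPlan-specific):

    local_gb = 959     # what my machines say they're backing up
    stored_gb = 672    # what the CrashPlan server reports holding

    savings = (local_gb - stored_gb) / local_gb
    print(f"compression/dedup savings: {savings:.0%}")   # ~30%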

Originally I was only looking at backing up about 200-300GB, but with the unlimited plan I just kept adding stuff that doesn't change (at a lower priority) and eventually it was all up there. I do back up VMs too, but haven't bothered doing anything particularly special with them.

In the time I've been using it (about a year now), I don't think there's been a single update. There are a few quirks and inflexible things, but really I find it's set and forget. It just works(tm). Most of the things I'd change relate to the initial backup stage. For example, if you have two backup jobs, with both a local and a remote server for the high-priority one but only the remote server for the lower-priority one, it will do the local high-priority backup first and then the lower-priority remote one. I realise it's trying to make sure all the data is backed up to at least one place first, but that's not what I wanted initially: I wanted the higher-priority data (the most important stuff, like photos) backed up remotely first. I was able to work around it by fiddling with some priorities, but it's still an issue whenever there's a large amount of new data in the lower-priority backup; it doesn't do the high-priority/remote-server work first.
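To make the complaint concrete, here's a hypothetical sketch in Python of the ordering it appears to use versus the one I wanted. The job names, fields, and sort keys are all made up for illustration; this is my guess at the behaviour, not anything from the actual client:

    # Made-up model of my two jobs and their destinations. "covered" means
    # a copy of that job's data already exists somewhere else.
    tasks = [
        {"job": "photos",  "priority": 1, "dest": "local",  "covered": False},
        {"job": "photos",  "priority": 1, "dest": "remote", "covered": True},   # local copy exists
        {"job": "archive", "priority": 2, "dest": "remote", "covered": False},
    ]

    # What it appears to do: anything with no copy anywhere goes first, then
    # priority - so the archive's remote upload jumps ahead of the photos one.
    observed = sorted(tasks, key=lambda t: (t["covered"], t["priority"]))

    # What I wanted: priority first, with the remote destination first within it.
    wanted = sorted(tasks, key=lambda t: (t["priority"], t["dest"] != "remote"))

    print([(t["job"], t["dest"]) for t in observed])
    # [('photos', 'local'), ('archive', 'remote'), ('photos', 'remote')]
    print([(t["job"], t["dest"]) for t in wanted])
    # [('photos', 'remote'), ('photos', 'local'), ('archive', 'remote')]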

Every now and then I do an audit to make sure everything's there, and apart from the initial issue with the code page (more my Linux box's fault), it's all been good. I need to do another one, actually.

One thing that does occasionally annoy me (particularly on my work machine) is that the CrashplanService.exe and CrashplanDesktop.exe (if open) processes seem to consume a slightly excessive amount of CPU. My work machine (a Dell Precision workstation) is a few years old but not that underpowered. The corporate coreload does have some weirdness of its own, though (e.g. sometimes a Remote Desktop session is unusable because the load seems to be spread at about 1% CPU across a lot of processes; log on locally and it comes good straight away).
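If you want to actually put a number on it, something like this Python sketch using psutil would do (psutil is my choice, nothing to do with CrashPlan; the process names are as they appear on my machine):

    import time
    import psutil

    # Sample the CrashPlan processes' CPU usage over a short window.
    targets = {"CrashplanService.exe", "CrashplanDesktop.exe"}
    procs = [p for p in psutil.process_iter(["name"]) if p.info["name"] in targets]

    for p in procs:
        p.cpu_percent(None)        # prime the per-process counter
    time.sleep(5)                  # measure over a 5 second window
    for p in procs:
        print(p.info["name"], p.cpu_percent(None), "% CPU")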

A couple of times I've found my assigned remote server (e.g. atls5.crashplan.com) has "disappeared" and the client couldn't connect. At least twice that was due to network issues at their end. Whilst it's sold as a cloud, it does seem that specific machines (not accounts) are assigned to specific backup servers. They do rotate around, and I've seen at least three or four different servers assigned to my NAS backup account at various times.
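When it happens, a quick reachability check saves guessing whether it's my network or theirs. A minimal Python sketch; the port is an assumption, so substitute whatever your client actually reports connecting to:

    import socket

    host = "atls5.crashplan.com"   # whichever server the client says it's assigned to
    port = 443                     # assumption - use the port your client reports

    try:
        with socket.create_connection((host, port), timeout=10):
            print(f"{host}:{port} is reachable")
    except OSError as err:
        print(f"{host}:{port} unreachable: {err}")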
_________________________
Christian
#40104192 120Gb (no longer in my E36 M3, won't fit the E46 M3)