I just find it hard to believe nobody else has run into this kind of problem, though.


Not many webmasters would want to do what you are doing. The vast majority are probably single-site people who don't really care about automated syncing. The rest are probably working for companies that wouldn't consider ftp a valid mechanism for site updates anyway.

Do you have ssh access? It's going to be convoluted without some kind of shell access. What you really want is a script that sshes to each host, tunnels the rsync port back to you, and then rsyncs against your machine.
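If you do have ssh access, a cron-able script along these lines would do it. This is just a rough sketch: it assumes key-based ssh logins and rsync installed on each host, and it uses rsync's own ssh transport rather than hand-tunnelling the rsync port (same effect, less plumbing). Users, hosts and paths are made-up placeholders.

```python
#!/usr/bin/env python3
"""Push each local working copy to its host with rsync over ssh.

A rough sketch: assumes key-based ssh logins and rsync installed on the
remote end.  Paths, users and hostnames below are placeholders."""
import subprocess
import sys

SITES = [
    # (local working copy, ssh user, host, remote document root)
    ("/home/me/sites/siteA/", "user1", "siteA.example.com", "/var/www/html/"),
    ("/home/me/sites/siteB/", "user2", "siteB.example.com", "/home/user2/public_html/"),
]

def push(local, user, host, remote):
    cmd = [
        "rsync", "-avz", "--delete",   # archive, compress, drop files deleted locally
        "-e", "ssh",                   # run over ssh instead of a separate port tunnel
        local,                         # trailing slash: sync contents, not the dir itself
        "%s@%s:%s" % (user, host, remote),
    ]
    return subprocess.call(cmd)

if __name__ == "__main__":
    failed = [host for (local, user, host, remote) in SITES
              if push(local, user, host, remote) != 0]
    if failed:
        sys.exit("rsync failed for: " + ", ".join(failed))
```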
Without some kind of shell access it's going to be difficult to ascertain which files need uploading. You could use wget to mirror each site, do a recursive diff against your working directory, and then ftp the necessary files, but diff output isn't necessarily parse-friendly. Besides, since mirroring pulls the whole site down first, on a symmetric ISP connection it would be quicker to just brute-force upload the entire site every day anyway.
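If you did go the wget route, something like this sketch would work for a purely static site (where the URL layout matches the directory layout). The URL, paths and ftp login are placeholders, it compares file contents rather than parsing diff output, and it only pushes new or changed files; deletions aren't handled.

```python
#!/usr/bin/env python3
"""Mirror a site with wget, compare against the working copy, and ftp up
the files that are new or differ.  A rough sketch: URL, paths and login
are placeholders, and deletions aren't handled."""
import filecmp
import ftplib
import os
import subprocess

SITE_URL   = "http://www.example.com/"        # placeholder
MIRROR_DIR = "/tmp/mirror/www.example.com"    # where wget drops its copy
LOCAL_DIR  = "/home/me/sites/siteA"           # working copy
FTP_HOST, FTP_USER, FTP_PASS = "ftp.example.com", "user", "secret"

def changed_files(local, mirror):
    """Return relative paths in the working copy that are new or differ."""
    out = []
    for root, dirs, files in os.walk(local):
        for name in files:
            lpath = os.path.join(root, name)
            rel = os.path.relpath(lpath, local)
            mpath = os.path.join(mirror, rel)
            if not os.path.exists(mpath) or not filecmp.cmp(lpath, mpath, shallow=False):
                out.append(rel)
    return out

def upload(relpaths):
    ftp = ftplib.FTP(FTP_HOST, FTP_USER, FTP_PASS)
    for rel in relpaths:
        parts = rel.split(os.sep)
        for i in range(1, len(parts)):
            try:
                ftp.mkd("/".join(parts[:i]))   # create intermediate dirs
            except ftplib.error_perm:
                pass                           # already exists
        with open(os.path.join(LOCAL_DIR, rel), "rb") as fh:
            ftp.storbinary("STOR " + "/".join(parts), fh)
    ftp.quit()

if __name__ == "__main__":
    # -m: mirror, -nH: no hostname directory, -q: quiet, -P: target directory
    subprocess.call(["wget", "-m", "-nH", "-q", "-P", MIRROR_DIR, SITE_URL])
    upload(changed_files(LOCAL_DIR, MIRROR_DIR))
```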

Of course, the most elegant non-shell solution would be a perl/php/cgi page that provides you with md5sums of every file on the site. md5sum your working directory, compare with the results from that page, and upload the differences. You'll need to make sure that new files (and possibly deleted ones, depending on preference) are handled correctly. A side benefit is that you can sanity-check the site after upload by repeating the md5sums. This solution probably doesn't scale particularly well though - md5summing a large site can take a while.
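The client half of that could look something like the sketch below. The /md5sums.cgi URL and paths are hypothetical, the server page is assumed to print the same "checksum  relative/path" lines that md5sum itself emits, and it only reports what needs uploading or deleting; the actual ftp-ing (and the post-upload sanity re-check) is left to taste.

```python
#!/usr/bin/env python3
"""Compare local md5sums against a checksum listing served by the site.

Assumes a hypothetical /md5sums.cgi on the server printing lines in the
same "checksum  relative/path" format as md5sum(1).  Only reports the
differences; uploading/deleting is a separate step."""
import hashlib
import os
from urllib.request import urlopen

SUMS_URL  = "http://www.example.com/md5sums.cgi"   # placeholder
LOCAL_DIR = "/home/me/sites/siteA"                 # working copy

def local_sums(base):
    """md5sum every file under the working copy, keyed by relative path."""
    sums = {}
    for root, dirs, files in os.walk(base):
        for name in files:
            path = os.path.join(root, name)
            rel = os.path.relpath(path, base)
            with open(path, "rb") as fh:
                sums[rel] = hashlib.md5(fh.read()).hexdigest()
    return sums

def remote_sums(url):
    """Parse the server's listing into the same {path: checksum} shape."""
    sums = {}
    for line in urlopen(url).read().decode().splitlines():
        if line.strip():
            checksum, rel = line.split(None, 1)
            sums[rel.strip()] = checksum
    return sums

if __name__ == "__main__":
    local, remote = local_sums(LOCAL_DIR), remote_sums(SUMS_URL)
    to_upload = sorted(p for p in local if remote.get(p) != local[p])  # new or changed
    to_delete = sorted(p for p in remote if p not in local)            # only on the server
    print("upload:", ", ".join(to_upload) or "nothing")
    print("delete:", ", ".join(to_delete) or "nothing")
```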
_________________________
Mk2a 60GB Blue. Serial 030102962 sig.mp3: File Format not Valid.