Looking at some of the posts from everyone, I want to toss out a couple of off-topic questions, just for opinions or answers.

Suppose you have a remote server (about 5 seconds to respond per request) holding half the data needed in a response; it speaks XML. You also have a front-end system, tightly coupled to a database, that can fetch the other half of the data quickly. The data on the remote server has a lifetime of around 30 minutes before it changes. Customer requests go through the fast server, but every response needs the other half of the relevant data from the slow, remote server. What's an efficient way to send back mostly fast, consistent responses? Ideas? I've heard of reverse-proxy caching, but don't know much about it.
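To make that concrete, here's the simplest thing I can picture: cache the slow server's answer for its 30-minute lifetime, so only the first request for a given key eats the 5-second round trip. A rough sketch in Python (the `fetch` function and key names are made up):

```python
import time

class TTLCache:
    """Cache slow remote responses for a fixed lifetime (e.g. 30 minutes)."""

    def __init__(self, fetch, ttl_seconds=30 * 60):
        self.fetch = fetch       # function: key -> remote XML payload (the 5 s call)
        self.ttl = ttl_seconds
        self._store = {}         # key -> (expires_at, value)

    def get(self, key):
        now = time.time()
        entry = self._store.get(key)
        if entry is not None and entry[0] > now:
            return entry[1]      # still fresh: answer from cache, no slow hit
        value = self.fetch(key)  # stale or missing: pay the slow call once
        self._store[key] = (now + self.ttl, value)
        return value


# Hypothetical usage: only the first lookup per key is slow.
def slow_fetch(key):
    return "<data>%s</data>" % key  # stand-in for the 5 s XML request

cache = TTLCache(slow_fetch)
cache.get("customer-42")   # slow, goes to the remote server
cache.get("customer-42")   # fast, served from the cache
```

A reverse proxy (Squid, Varnish, etc.) sitting in front of the slow server would do essentially this at the HTTP layer, keyed by URL, without any application code.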

The other question has to do with large data stores. We have a situation where a few million records are inserted, updated, or deleted per day. Ideally, the database would hold about 400 million records, of which a few million change daily. The changes arrive in fits and starts via flat files, and a load takes half a day. However, requests for this data are constant. What's a good architecture that allows uninterrupted read access to those tables while the loads run? Clues?
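One pattern I've been wondering about is a staging-table swap: run the half-day load into an offline copy while readers keep hitting the live table, then swap the two with a pair of renames, which takes milliseconds instead of hours. A sketch of the idea using SQLite (table and column names invented; a production database would use its own rename or partition-exchange mechanism):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records_live (id INTEGER PRIMARY KEY, val TEXT)")
conn.execute("INSERT INTO records_live VALUES (1, 'old')")

# 1. Build the new version off to the side; all reads still go to
#    records_live, so the half-day load never blocks anyone.
conn.execute("CREATE TABLE records_staging (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany(
    "INSERT INTO records_staging VALUES (?, ?)",
    [(1, "new"), (2, "also new")],
)

# 2. Swap: two renames in one transaction are near-instant, so readers
#    see either the fully old or the fully new table, never a half-load.
with conn:
    conn.execute("ALTER TABLE records_live RENAME TO records_old")
    conn.execute("ALTER TABLE records_staging RENAME TO records_live")
conn.execute("DROP TABLE records_old")

rows = conn.execute("SELECT id, val FROM records_live ORDER BY id").fetchall()
```

Since only a few million of the 400 million rows change daily, the staging copy could also be built by cloning the live table and applying just the deltas, rather than reloading everything.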

Calvin