I'd be looking at placing an ORB on each machine serving up data. Then place a fast server in the system, either stand-alone or virtualised across several physical machines. On this machine you could put up a custom CGI app written in either Java or C++, using CORBA to pull the data sources together under a common access point. With the info distributed like this you can look at parallelism, load balancing, etc.
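To make the "common access point" idea concrete, here's a minimal sketch in plain Java (no CORBA plumbing, since the ORB wiring is deployment-specific): a facade fans a query out to several back-end sources in parallel and merges the results. The `Source` interface and all names are illustrative stand-ins for the remote object references you'd get from the ORB.

```java
import java.util.*;
import java.util.concurrent.*;

// Illustrative common access point: queries every registered source
// concurrently and merges whatever comes back. In the CORBA version,
// each Source would be a remote object reference resolved via the ORB.
public class AccessPoint {
    interface Source {
        List<String> query(String q);
    }

    private final List<Source> sources;
    // Daemon threads so the JVM can exit without an explicit shutdown.
    private final ExecutorService pool = Executors.newFixedThreadPool(4, r -> {
        Thread t = new Thread(r);
        t.setDaemon(true);
        return t;
    });

    AccessPoint(List<Source> sources) {
        this.sources = sources;
    }

    // Fan out to all sources in parallel, then collect in submission order.
    List<String> query(String q) throws Exception {
        List<Future<List<String>>> futures = new ArrayList<>();
        for (Source s : sources)
            futures.add(pool.submit(() -> s.query(q)));
        List<String> merged = new ArrayList<>();
        for (Future<List<String>> f : futures)
            merged.addAll(f.get());
        return merged;
    }
}
```

The point is that the client only ever talks to `AccessPoint`; whether the sources live on one box or twenty is invisible to it, which is what lets you add load balancing behind the facade later.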
Alternatively, run up JSPs on a server, with servlets on the different data source machines providing access to each data source. The JSP applications can access the servlets according to the required content, or serve up pages to the clients that interrogate the servlets separately.
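The per-source-servlet layout can be sketched like this. To keep it self-contained I've used the JDK's built-in `com.sun.net.httpserver.HttpServer` as a stand-in for a real servlet container, and a plain HTTP fetch as a stand-in for the JSP layer; the names and port are purely illustrative.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.*;
import java.net.*;

// One small HTTP endpoint per data source machine (stand-in for a
// servlet); the page-assembly side fetches from whichever endpoints
// the requested page needs.
public class SourceEndpoint {
    public static HttpServer serve(int port, String sourceName) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/data", ex -> {
            byte[] body = ("data from " + sourceName).getBytes();
            ex.sendResponseHeaders(200, body.length);
            try (OutputStream os = ex.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    // The JSP-side role: pull a fragment from one source endpoint.
    public static String fetch(int port) throws IOException {
        URL url = new URL("http://localhost:" + port + "/data");
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(url.openStream()))) {
            return in.readLine();
        }
    }
}
```

Each data source machine runs its own endpoint, so a slow source only slows the page fragments that depend on it.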
The biggest problem you seem to have is the different delivery rates from different sources. You could handle that by composing the UI delivered to the user out of servlet clients: each client contacts a specific servlet, registers an interest, and then waits for updates to be pushed back to it ("server push" rather than "client pull"). Works very well if you get it right.
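The register-then-wait pattern can be sketched in a few lines; this is an assumption-laden toy, not the servlet API itself, and `PushHub` and the topic names are made up. The client registers interest and blocks on a queue; the source pushes updates into it whenever they arrive, so nothing polls.

```java
import java.util.*;
import java.util.concurrent.*;

// Server-push sketch: clients register interest in a topic and block
// on a queue; the data source pushes updates to all registered queues.
public class PushHub {
    private final Map<String, List<BlockingQueue<String>>> interested =
            new ConcurrentHashMap<>();

    // Client side: register an interest, get a queue to wait on.
    public BlockingQueue<String> register(String topic) {
        BlockingQueue<String> q = new LinkedBlockingQueue<>();
        interested.computeIfAbsent(topic, t -> new CopyOnWriteArrayList<>()).add(q);
        return q;
    }

    // Source side: push an update to every client registered for the topic.
    public void push(String topic, String update) {
        for (BlockingQueue<String> q : interested.getOrDefault(topic, List.of()))
            q.add(update);
    }
}
```

A client would do `hub.register("prices").take()` in a loop: fast sources push often, slow sources push rarely, and the client never busy-waits on either.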
With the data source integration issues, you are touching on two problems, and you should analyse them separately: constant-load external access, and non-intrusive data integration. Both have pretty standard solutions provided off the shelf by the major DB engine vendors. Multi-source updating was something I had to deal with for a telephone info system where around 100 agents were continually accessing the DB as it was updated. We handled it with buffered processes and a common access point for all requests. It meant the agents had to wait and got lower performance accessing the DB, but it ensured the updating/integration took priority with no data loss, which was our design criterion.
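The buffered, update-first access point we used can be sketched roughly like this (a simplified reconstruction from memory, not our actual code; all names are illustrative). Everything funnels through one priority queue and a single worker drains it, so updates always jump ahead of agent reads.

```java
import java.util.*;
import java.util.concurrent.*;

// Buffered common access point: all requests go through one queue,
// and UPDATE requests sort ahead of READ requests, so integration
// never loses data at the cost of some read latency.
public class RequestBroker {
    enum Kind { UPDATE, READ }           // enum order = priority order
    record Request(Kind kind, String payload) {}

    private final PriorityBlockingQueue<Request> queue =
            new PriorityBlockingQueue<>(16, Comparator.comparing(Request::kind));

    // Agents and feeds alike submit here; nobody touches the DB directly.
    public void submit(Request r) {
        queue.add(r);
    }

    // A single worker thread would loop on this, always seeing
    // pending updates before any pending reads.
    public Request next() throws InterruptedException {
        return queue.take();
    }
}
```

The single drain point is the whole trick: because only one worker talks to the DB, prioritisation is just queue ordering, and there's no locking tangle between the ~100 agents and the update feeds.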
_________________________
One of the few remaining Mk1 owners...