On Mon, 22 Jul 1996, Dorian R. Kim wrote:
> I don't follow HTTP evolution, so I could be missing something, but for a distributed web service that is transparent to users, you'd need something at the client level that keeps track of RTT and has some persistence.
Nope. You only need a form of redirect that is transparent to the end user, so their bookmark files etc. will always refer to the original master controller site, plus some similar mods to web crawlers and caches. The actual analysis of topology would be separate from serving up pages and would likely use more than one technique, depending on the situation.

Of course, all this is hypothetical right now and would have minimal impact on network operations, except for XP operators who have to build out more colo rack space...

Michael Dillon - ISP & Internet Consulting
Memra Software Inc. - Fax: +1-604-546-3049
http://www.memra.com - E-mail: michael@memra.com
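The redirect scheme described above could be sketched roughly as follows: a master controller site that answers every request with a temporary (302) redirect to a nearby mirror, so the URL the user keeps in their bookmark file is always the master's. The mirror list and the nearness heuristic here are placeholders, since the post leaves the topology analysis open-ended.

```python
# Minimal sketch of a "master controller" redirect server.
# MIRRORS and pick_mirror() are hypothetical stand-ins; a real
# controller would use RTT measurements or routing data to pick
# the closest mirror for a given client.
from http.server import BaseHTTPRequestHandler, HTTPServer

MIRRORS = [
    "http://mirror-east.example.net",
    "http://mirror-west.example.net",
]

def pick_mirror(client_ip: str) -> str:
    # Placeholder topology analysis: just hash the client address
    # into the mirror list so the example is self-contained.
    return MIRRORS[hash(client_ip) % len(MIRRORS)]

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 302 is a *temporary* redirect: clients and caches keep
        # treating the master controller's URL as canonical.
        target = pick_mirror(self.client_address[0]) + self.path
        self.send_response(302)
        self.send_header("Location", target)
        self.end_headers()

# To run the controller (blocks forever):
#   HTTPServer(("", 8000), RedirectHandler).serve_forever()
```

A permanent (301) redirect would defeat the purpose, since well-behaved clients and caches would then remember the mirror's address instead of the master site's.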