Man, I hope I never become as cynical as you.

On 2-mrt-2006, at 11:09, Stephen Sprunk wrote:
> Why is it even remotely rational that a corporate admin trust 100k+ hosts infested with worms, viruses, spam, malware, etc. to handle multihoming decisions?
They trust those hosts to do congestion control too, which is even more important.
> Especially when we don't even have a sample of working code today?
The IAB goes out of its way to solicit input on ongoing work, and now you complain about a lack of working code?
> Now, some may take that as a sign the IETF needs to figure out how to handle 10^6 BGP prefixes... I'm not sure we'll be there for a few years with IPv6, but sooner or later we will, and someone needs to figure out what the Internet is going to look like at that point.
It won't look good. ISPs will have to buy much more expensive routers. At some point, people will start filtering out routes they feel they can live without, and universal reachability will be a thing of the past. It will be just like NAT: every individual problem will be solvable, but as an industry, or even as a society, we'll waste enormous amounts of time, energy, and money simply because we didn't want to bite the bullet earlier on.