On Thu, 7 Jun 2001, Craig Partridge wrote:
> I had some talks with several of the self-similarity experts at the time, and they said this was perfectly plausible -- the law of large numbers says this must eventually happen; the self-similarity results simply implied that the amount of traffic required to reach stability was *much* larger than it would be if traffic were Poisson. At the core, we've apparently reached that point.
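Plausible indeed. Here is a quick numerical sketch of that aggregation argument (my own illustration, not Craig's: Pareto-distributed per-interval loads stand in for the heavy-tailed bursts behind self-similar traffic, and every parameter is made up):

    import numpy as np

    rng = np.random.default_rng(42)
    INTERVALS = 10_000

    def cv(per_source_draw, n_sources):
        """Coefficient of variation (std/mean) of the aggregate of
        n_sources independent sources, one sample per time interval."""
        total = per_source_draw((n_sources, INTERVALS)).sum(axis=0)
        return total.std() / total.mean()

    # A Poisson-ish source vs. a heavy-tailed one (numpy's pareto is
    # the Lomax distribution; shape 1.5 gives finite mean but infinite
    # variance, the regime associated with self-similar aggregates).
    poisson = lambda shape: rng.poisson(10.0, size=shape)
    heavy   = lambda shape: 10.0 * rng.pareto(1.5, size=shape)

    for n in (1, 10, 100, 1000):
        print(f"{n:5d} sources: CV poisson={cv(poisson, n):.3f}  "
              f"heavy-tailed={cv(heavy, n):.3f}")

The Poisson aggregate smooths out roughly like 1/sqrt(N); the heavy-tailed one needs orders of magnitude more sources before it looks equally stable, which is the *much* larger above.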
I attended a seminar where self-similarity researchers from Ericsson were speaking (at IETF INET2001). They had not tested the theory with thousands of TCP connections on high-capacity links, but one thing that caught my eye was that, according to them, congestion needs to exist for self-similarity to occur. Well, if you have congestion in your core, you're doing something wrong. Fix it, and the problem goes away (at least in the core).

The same goes for the end-to-end QoS some people were talking about there (for some reason they all seemed to come from the telephony world, god knows why *smirk*): you only need QoS in the core if you have congestion, and then you're throwing man-hours at the problem instead of buying hardware and more capacity. I'd go for overprovisioning every day of the week.
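To put a rough number on "fix it and the problem goes away", here's an M/M/1 back-of-the-envelope (the link speed and packet size are assumptions I picked for illustration, and M/M/1 is optimistic for self-similar input, which only strengthens the point):

    # Mean time through an M/M/1 queue: T = 1/(mu - lambda) = 1/(mu*(1-rho)).
    LINK_BPS = 2.5e9                  # assumed OC-48 core link
    PKT_BITS = 1500 * 8               # assumed full-size packets
    mu = LINK_BPS / PKT_BITS          # service rate in packets/second

    for rho in (0.3, 0.5, 0.7, 0.9, 0.99):
        t_us = 1e6 / (mu * (1.0 - rho))   # mean per-hop delay, microseconds
        print(f"utilization {rho:4.2f}: mean per-hop delay {t_us:7.1f} us")

Per-hop queueing delay stays down in the microseconds until the link runs close to full, then blows up; with self-similar traffic the knee arrives even earlier. Buying capacity keeps you on the flat part of that curve, while QoS only decides who suffers on the steep part.

--
Mikael Abrahamsson    email: swmike@swm.pp.se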