There had been reports of this discovery about four years ago -- as I recall, Tony Li reported hearing from customers that traffic levels at the core were remarkably stable.

I had some talks with several of the self-similarity experts at the time, and they said this was perfectly plausible -- the law of large numbers eventually says this must happen; the self-similarity results simply implied that the amount of traffic required to reach stability was *much* larger than would be required if traffic were Poisson. At the core, we've apparently reached that point.

My recollection of the discussions is that there's a lot of interesting work to be done on the structure of those stable flows (what's going on within the aggregate), as well as on working out where the traffic gets small enough to become self-similar. But these discussions were a few years ago and my memory may be faulty.

Craig

In message <0C875DC28791D21192CD00104B95BFE70146DD09@BGSLC02>, Irwin Lazar writes:
FYI: Bell Labs recently conducted an analysis of the nature of Internet traffic. Some pretty interesting findings.
http://Networking.ittoolbox.com/browse.asp?c=NetworkingNews&r=/news/dispnews.asp?i=44100
Through the use of sophisticated new software programs that analyzed and simulated data traffic in "unprecedented detail," Bell Labs researchers found that the "burstiness" seen in traffic at the edges of the Internet disappears at the core. The discovery that traffic on heavily loaded, high-capacity network links is unexpectedly regular may point the way to more efficient system and network designs with better performance at lower cost, Bell Labs said.
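P.S. For anyone who wants to see the law-of-large-numbers point concretely, here is a quick back-of-the-envelope simulation -- my own sketch in Python/numpy, nothing to do with the Bell Labs methodology, and all the parameters (per-source rate, Pareto alpha, number of intervals) are arbitrary choices of mine. It just compares how quickly the burstiness (coefficient of variation of traffic per interval) shrinks as you aggregate Poisson sources versus heavy-tailed Pareto sources of the sort the self-similarity literature uses: the Poisson aggregate smooths out like 1/sqrt(n), while the heavy-tailed aggregate needs far more sources to settle down.

    # Sketch only: compare how "burstiness" (coefficient of variation of
    # traffic per measurement interval) shrinks with aggregation for
    # Poisson sources vs. heavy-tailed (Pareto) sources.  All parameters
    # below are illustrative, not taken from any measurement study.
    import numpy as np

    rng = np.random.default_rng(0)
    intervals = 10_000      # number of measurement intervals
    rate = 10.0             # mean units of traffic per source per interval

    def cov(per_interval_traffic):
        # Coefficient of variation: std / mean of aggregate traffic per interval.
        return per_interval_traffic.std() / per_interval_traffic.mean()

    for n_sources in (1, 10, 100, 1000):
        # Poisson sources: the aggregate is again Poisson, so its CoV
        # falls off like 1/sqrt(n_sources * rate).
        poisson = rng.poisson(rate, size=(n_sources, intervals)).sum(axis=0)

        # Heavy-tailed sources: Pareto-distributed traffic per interval
        # (alpha <= 2 gives infinite variance, the usual rough ingredient
        # behind self-similar traffic), rescaled to the same mean rate.
        alpha = 1.5
        pareto = rng.pareto(alpha, size=(n_sources, intervals)) + 1.0
        pareto *= rate * (alpha - 1) / alpha     # mean ~= rate per source
        heavy = pareto.sum(axis=0)

        print(f"{n_sources:5d} sources:  Poisson CoV={cov(poisson):.3f}  "
              f"heavy-tailed CoV={cov(heavy):.3f}")

Running it, the Poisson column drops steadily with aggregation while the heavy-tailed column stays large and erratic until the number of sources gets very big -- which is essentially the point the experts were making.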