Disclaimer: I often use the M/M/1 queuing assumption for much of my work to keep the maths simple, and I believe I am reasonably aware of the contexts in which it is or isn't the right application :). Also, I don't intend to change the core topic of the thread, but since this has come up, I couldn't resist.
>> With 99% load M/M/1, 500 packets (750kB for 1500B MTU) of
>> buffer is enough to make packet drop probability less than
>> 1%. With 98% load, the probability is 0.0041%.
To expand the above a bit so that there is no ambiguity: it assumes that the router behaves like an M/M/1 queue. The expected number of packets in the system is given by

  E[N] = rho / (1 - rho)

where rho is the link utilization. The probability that at least B packets are in the system is given by

  P(N >= B) = rho^B

where B is the buffer size in packets; with a buffer of B packets, this tail probability is effectively the drop probability. For a link utilization of 0.98, the drop probability is 0.98^500 = 4.1e-5, i.e. the 0.0041% quoted above; for a link utilization of 0.99, it is 0.99^500 = 0.00657, i.e. about 0.66%.
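If anyone wants to plug in other buffer sizes or utilizations, here is a small Python sketch of the two formulas above (my own throwaway code, nothing authoritative):

# M/M/1 back-of-the-envelope numbers.
#   rho = link utilization, B = buffer size in packets
#   E[N]      = rho / (1 - rho)   expected packets in the system
#   P(N >= B) = rho**B            tail probability, used here as the drop
#                                 probability for a buffer of B packets

def expected_packets(rho):
    return rho / (1.0 - rho)

def drop_probability(rho, B):
    return rho ** B

for rho in (0.98, 0.99):
    p = drop_probability(rho, 500)
    print("rho=%.2f: E[N]=%5.1f packets, P(drop)=%.6f (%.4f%%)"
          % (rho, expected_packets(rho), p, p * 100))

That reproduces the 0.0041% figure for 98% load and gives about 0.66% for 99% load, still under the 1% mentioned above.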
>> When many TCPs are running, burst is averaged and traffic
>> is poisson.
M/M/1 queuing assumes that traffic is Poisson, and the Poisson assumption rests on two things (the first of which a small sketch after this list tries to illustrate):
1) The number of sources is infinite.
2) The traffic arrival pattern is random.
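Here is that toy Python sketch (entirely my own, with arbitrary parameters): each source sends with uniform, clearly non-exponential gaps, yet as more independent sources are merged, the aggregate's inter-arrival times drift towards the memoryless behaviour that Poisson assumes, i.e. a coefficient of variation near 1.

import random

def one_source(mean_gap, horizon):
    """Arrival times of one source with uniform (non-exponential) gaps."""
    t, times = 0.0, []
    while True:
        t += random.uniform(0.5 * mean_gap, 1.5 * mean_gap)
        if t > horizon:
            return times
        times.append(t)

def cv_of_gaps(times):
    """Coefficient of variation (stddev/mean) of inter-arrival times.
    Exponential gaps (Poisson arrivals) give CV = 1; uniform gaps give ~0.29."""
    gaps = [b - a for a, b in zip(times, times[1:])]
    mean = sum(gaps) / len(gaps)
    var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
    return var ** 0.5 / mean

random.seed(2)
horizon = 10_000.0
for nsources in (1, 10, 100):
    # keep the aggregate rate constant at roughly 1 packet per time unit
    merged = sorted(t for _ in range(nsources)
                    for t in one_source(mean_gap=nsources, horizon=horizon))
    print("%3d sources: CV of merged inter-arrivals = %.2f"
          % (nsources, cv_of_gaps(merged)))

With one source the CV sits near 0.29; with a hundred it gets close to 1, which is the usual hand-wavy justification for assumption 1).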
I think the second assumption is the one I keep questioning: is the traffic arrival pattern truly random? I have seen cases where traffic behaves much more like a self-similar process. Most Poisson arguments lean on the central limit theorem, which loosely states that as we aggregate samples from more and more independent distributions, the sample distribution approaches a normal one and the mean smooths towards a stable value. Self-similar traffic is precisely the case where that smoothing does not happen: the burstiness persists across aggregation scales.
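To make the "does it actually smooth out?" question concrete, here is another toy sketch (again mine, with made-up parameters). It compares the index of dispersion for counts, IDC(T) = Var[N(T)] / E[N(T)], at several time scales for a Poisson stream versus a renewal stream with heavy-tailed (Pareto, infinite-variance) gaps. For Poisson, IDC stays near 1 at every scale; for the heavy-tailed stream it keeps growing with the scale, which is the self-similar signature.

import random

def poisson_arrivals(rate, horizon):
    """Poisson process: independent exponential inter-arrival times."""
    t, times = 0.0, []
    while True:
        t += random.expovariate(rate)
        if t > horizon:
            return times
        times.append(t)

def heavy_tailed_arrivals(alpha, horizon):
    """Renewal process with Pareto(alpha) gaps, rescaled to a mean gap of 1.
    For 1 < alpha < 2 the gap variance is infinite."""
    mean_gap = alpha / (alpha - 1.0)  # mean of random.paretovariate(alpha)
    t, times = 0.0, []
    while True:
        t += random.paretovariate(alpha) / mean_gap
        if t > horizon:
            return times
        times.append(t)

def idc(times, horizon, bin_size):
    """Index of dispersion for counts: variance / mean of per-bin counts."""
    nbins = int(horizon // bin_size)
    counts = [0] * nbins
    for t in times:
        i = int(t // bin_size)
        if i < nbins:
            counts[i] += 1
    mean = sum(counts) / nbins
    var = sum((c - mean) ** 2 for c in counts) / nbins
    return var / mean

random.seed(42)
horizon = 100_000.0
poisson = poisson_arrivals(1.0, horizon)
heavy = heavy_tailed_arrivals(1.5, horizon)

for scale in (1, 10, 100, 1000):
    print("bin=%5d  IDC(poisson)=%6.2f  IDC(heavy-tailed)=%8.2f"
          % (scale, idc(poisson, horizon, scale), idc(heavy, horizon, scale)))

If the sketch is right, the Poisson column hovers around 1 while the heavy-tailed column keeps climbing as the bin size grows; that persistence of burstiness across time scales is what I mean by self-similar.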
Do you have any good pointers to research showing that today's Internet traffic can be modeled accurately as Poisson? For every paper I have seen supporting Poisson, I have seen another arguing that it isn't.