dip wrote:
> I have seen cases where traffic behaves in a self-similar fashion.
That could happen if there is only a small number of TCP streams, or if multiple TCPs are synchronized through interactions in bloated buffers, which is one reason why we should avoid bloated buffers.
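To make that synchronization mechanism concrete, here is a minimal AIMD toy model (a sketch under purely illustrative parameters, not a real TCP simulator; "synchronized" stands in for a shared bloated FIFO whose overflow makes every flow back off at once, "desynchronized" for loss hitting one flow at a time):

    import random

    # Toy AIMD model: compare the aggregate rate when a loss event hits
    # every flow at once (synchronized backoff) vs. one flow at a time.
    # All numbers here are illustrative assumptions.
    def aggregate_trace(n_flows, synchronized, steps=2000, capacity=1000.0):
        rates = [capacity / n_flows] * n_flows
        trace = []
        for _ in range(steps):
            rates = [r + 1.0 for r in rates]          # additive increase
            if sum(rates) > capacity:                 # shared buffer overflows
                if synchronized:
                    rates = [r / 2.0 for r in rates]  # every flow halves together
                else:
                    i = random.randrange(n_flows)     # only one flow sees the loss
                    rates[i] /= 2.0
            trace.append(sum(rates))
        return trace

    for sync in (True, False):
        t = aggregate_trace(n_flows=20, synchronized=sync)
        mean = sum(t) / len(t)
        stdev = (sum((x - mean) ** 2 for x in t) / len(t)) ** 0.5
        print(f"synchronized={sync}: mean={mean:.0f}, stdev={stdev:.0f}")

The synchronized case produces a large global sawtooth in the aggregate rate, while the desynchronized case keeps the aggregate close to capacity; the contrast shows up in the printed standard deviations.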
> Do you have any good pointers to research showing that today's Internet
> traffic can be modeled accurately as Poisson? For as many papers
> supporting Poisson, I have seen as many saying it is not Poisson.
The research claiming self-similarity is based on observations made between 1989 and 1994, when the Internet backbone was slow and the number of users was small, which means the number of TCP streams running in parallel was also small. For example, merely 124M packets over 36 days of observation [LBL-1] averages out to less than 500 kbps, a rate that even a single TCP connection could fill on computers of that era, so it is not a meaningful measurement.
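As a back-of-the-envelope check of that rate (assuming an average packet size of 1500 bytes, which is my assumption and an upper bound; the trace's real packets were likely smaller, making the average rate even lower):

    # Average rate of the LBL-1 trace quoted above.
    # Assumption: ~1500-byte packets (an upper bound, not from the text).
    packets = 124e6
    seconds = 36 * 86400              # 36 days of observation
    pps = packets / seconds           # ~40 packets/s on average
    kbps = pps * 1500 * 8 / 1e3       # ~478 kbit/s, below 500 kbps
    print(f"{pps:.1f} packets/s ~= {kbps:.0f} kbit/s")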
> https://www.cs.wustl.edu/~jain/cse567-06/ftp/traffic_models2/#sec1.2
It merely states that some researchers use non-Poisson traffic models.

Masataka Ohta