The question you pose cannot be answered as succinctly as you might think. Many factors contribute to packet loss and latency, and many of them are completely unrelated to one another.

For example, if you purchase ATM CBR connectivity from New York to Australia, you will see high latency but (assuming you stay within your traffic contract) no packet loss. Here the latency is due to propagation delay: the great-circle distance alone is roughly 16,000 km, and light in fiber travels at about 200,000 km/s, so the one-way delay is at least 80 ms (a 160+ ms round trip) before any queueing occurs. Then again, you may have GigE connectivity across the data center floor to a switching cluster supporting a group of application servers that cannot handle the load being delivered by the network. Here there is little latency, but there may be significant packet loss.

Also note that the traditional Poisson distribution on which early queueing theory was founded is not always the best model for today's Internet. Traffic patterns at exchange points are becoming increasingly self-similar (fractal) in nature, and new queueing technologies such as WRED, coupled with rate-limiting technologies such as CAR, take your assumptions about available bandwidth, switching latency, and queue/buffer depth and flush them down the drain.

The point is, this is a difficult question to answer. If I were to make a VERY general assumption, however, I might be persuaded to agree with your proposed correlation, although many would object, with evidence to substantiate their position.

Regards,
Chris

-----Original Message-----
From: jzeeff@whs.verio.net [mailto:jzeeff@whs.verio.net]
Sent: Wednesday, January 13, 1999 4:18 PM
To: nanog@merit.edu
Subject: latency vs. packet loss

How well does latency correlate to packet loss on the Internet? For example, if one were to pick one of several randomly placed sites on the net based on lowest latency to/from point x, what percentage of the time would this also yield the site with the lowest packet loss to/from point x?

My guess is that the correlation is high (due to typical buffer sizes).
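
P.S. For what it's worth, one rough way to put the question to an empirical test is to probe each candidate site, record average RTT and loss rate, and compute a rank correlation between the two. What follows is a minimal sketch, not a production tool: it assumes a Unix-like ping whose summary output contains lines like "5% packet loss" and "rtt min/avg/max/mdev = ...", and the host names are purely illustrative.

#!/usr/bin/env python
# Rough empirical test of the latency/loss correlation question:
# ping each candidate site, record average RTT and loss fraction,
# then compute a Spearman rank correlation between the two series.
# Assumes a Unix-like `ping` whose summary resembles
# "20 packets transmitted, 19 received, 5% packet loss" and
# "rtt min/avg/max/mdev = 1.2/3.4/5.6/0.7 ms".
# The host list below is purely illustrative.

import re
import subprocess

HOSTS = ["site-a.example.net", "site-b.example.net", "site-c.example.net"]
COUNT = 20  # echo requests per host

def probe(host):
    """Return (avg_rtt_ms, loss_fraction) for one host, or None on failure."""
    out = subprocess.run(
        ["ping", "-c", str(COUNT), "-q", host],
        capture_output=True, text=True,
    ).stdout
    loss = re.search(r"([\d.]+)% packet loss", out)
    rtt = re.search(r"= [\d.]+/([\d.]+)/", out)  # min/avg/max...; group is avg
    if not loss or not rtt:
        return None
    return float(rtt.group(1)), float(loss.group(1)) / 100.0

def ranks(values):
    """Rank values (1 = smallest); ties broken by order, adequate here."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(xs, ys):
    """Spearman rank correlation via the standard 1 - 6*sum(d^2)/(n(n^2-1))."""
    n = len(xs)
    rx, ry = ranks(xs), ranks(ys)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

samples = [s for s in (probe(h) for h in HOSTS) if s]
if len(samples) >= 3:
    rtts = [s[0] for s in samples]
    losses = [s[1] for s in samples]
    print("Spearman correlation, RTT vs. loss: %.3f" % spearman(rtts, losses))

Run against a large, diverse set of sites over repeated trials, a consistently high coefficient would support the buffer-depth argument; a low or unstable one would suggest latency and loss are dominated by different mechanisms (propagation vs. congestion), as argued above.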