True, one can certainly think of and even find many situations where they don't correlate, but in real-world measurements it looks like, in perhaps 90% of cases, packet loss between randomly chosen places on the Internet is accompanied by greater-than-typical latency. I suppose this suggests that saturated links (where the router/switch adds latency by buffering) are a common cause of Internet packet loss (vs. line errors, etc., which would not show this correlation). As a single example, connectivity from here to a popular NSP web site is 40 msec in the morning with no packet loss, and 500 msec with 20% packet loss in the afternoon. Routing is the same (through mae-east :-)).
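If anyone wants to reproduce that measurement, here's a rough sketch. The host name is a placeholder, and the output parsing assumes Linux iputils ping (it will break if every packet is lost, since the rtt summary line is absent in that case):

    #!/usr/bin/env python3
    # Ping a host once and report average RTT and packet loss.
    import re, subprocess

    def probe(host, count=20):
        out = subprocess.run(["ping", "-c", str(count), host],
                             capture_output=True, text=True).stdout
        # Matches "... 5% packet loss ..." and
        # "rtt min/avg/max/mdev = a/b/c/d ms" in iputils output.
        loss = float(re.search(r"([\d.]+)% packet loss", out).group(1))
        avg  = float(re.search(r"= [\d.]+/([\d.]+)/", out).group(1))
        return avg, loss

    if __name__ == "__main__":
        rtt, loss = probe("www.example.net")   # placeholder host
        print(f"avg RTT {rtt:.0f} ms, loss {loss:.0f}%")

Run it in the morning and again in the afternoon to see the kind of swing described above.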
How well does latency correlate with packet loss on the Internet? For example, if one were to pick one of several randomly placed sites on the net based on lowest latency to/from point x, what percentage of the time would this also yield the site with the lowest packet loss to/from point x? My guess is that the correlation is high (due to typical buffer sizes).
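A sketch of that experiment, reusing probe() from the code above (the site list is hypothetical): probe each candidate, then check whether the lowest-latency site is also the lowest-loss one. Repeating this many times would give the percentage asked about.

    # Does picking the min-RTT site also pick the min-loss site?
    sites = ["a.example.net", "b.example.net", "c.example.net"]
    results = {s: probe(s) for s in sites}        # site -> (avg_rtt, loss)
    best_rtt  = min(results, key=lambda s: results[s][0])
    best_loss = min(results, key=lambda s: results[s][1])
    print("lowest latency:", best_rtt, "/ lowest loss:", best_loss,
          "-> agree" if best_rtt == best_loss else "-> disagree")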
Remember that latency is also affected by other things, like distance (you won't get less than ~70 ms RTT NY<->Lon even on an empty STM-1).
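That ~70 ms floor is just speed-of-light-in-fibre arithmetic plus some slack for real cable routes; a quick back-of-the-envelope, assuming ~5,570 km great-circle distance and light at roughly 2/3 c in fibre:

    # Propagation-delay floor, NY <-> London (approximate figures).
    distance_km     = 5570      # great-circle distance
    fibre_speed_kms = 200000    # ~0.66 c in optical fibre
    rtt_ms = 2 * distance_km / fibre_speed_kms * 1000
    print(f"theoretical RTT floor: {rtt_ms:.0f} ms")   # ~56 ms; real cable
                                                       # paths are longer,
                                                       # hence ~70 ms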
Also note there are some conditions which cause packet loss but won't add ICMP latency (line errors, various IXP overload conditions).