On Tue, 25 Oct 2005 16:28:05 -0000, "Christopher L. Morrow" said:
On Mon, 24 Oct 2005, Blaine Christian wrote:
Yeah, but that's just me pinging everything, and Google and Yahoo fighting over who has the most complete list of X-rated sites.
And this probably depends greatly on the network, user population, and business involved. Is it even a metric worth tracking?
It's a fight for eyeballs, isn't it? Routing table hits caused by spidering from search engines will give a good indication of what percentage of the address space the spiders are covering. Of course, you need views from a number of places, and some adjustment for the fact that the webservers are usually clumped in very small pockets of address space.

On the other hand, suppose it can be established that 80% of the routing table is hit every <N minutes>, which would tend to argue against caching a very small subset. If the vast majority of those routing table hits turn out to be just spiders, that may mean a cache miss isn't as important as we thought... Anybody got actual measured numbers on how much of the hits are just spiders and Microsoft malware scanning for vulnerable hosts?
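For anyone sitting on the flow data, a back-of-the-envelope way to get at those numbers might look something like the sketch below. To be clear, none of this is from any real collector: the file formats, the (timestamp, src, dst) flow layout, and the "spider" source ranges are all placeholders you'd replace with your own routing table dump, your own flow export, and whatever crawler/scanner lists you trust.

#!/usr/bin/env python3
# Sketch only: estimate what fraction of the routing table is "hit" per
# N-minute window, and how much of that coverage comes from sources we
# have labeled as spiders/scanners.  All inputs are assumed formats.

import ipaddress
from collections import defaultdict

WINDOW_SECS = 5 * 60  # stand-in for the "<N minutes>" window in the thread

# Hypothetical crawler/scanner source ranges -- illustrative values only,
# not authoritative lists for any search engine.
SPIDER_RANGES = [ipaddress.ip_network(n) for n in (
    "192.0.2.0/24",
    "198.51.100.0/24",
)]

def load_prefixes(path):
    """Load routing-table prefixes, one CIDR per line (assumed file format)."""
    with open(path) as f:
        return [ipaddress.ip_network(line.strip()) for line in f if line.strip()]

def longest_match(addr, prefixes):
    """Naive longest-prefix match; fine for a sketch, far too slow for real flow volumes."""
    best = None
    for p in prefixes:
        if addr in p and (best is None or p.prefixlen > best.prefixlen):
            best = p
    return best

def is_spider(src):
    """True if the source address falls in one of the labeled crawler ranges."""
    return any(src in r for r in SPIDER_RANGES)

def coverage(flows, prefixes):
    """flows: iterable of (timestamp, src_ip, dst_ip) tuples (assumed layout)."""
    windows = defaultdict(lambda: {"hit": set(), "spider_hit": set()})
    for ts, src, dst in flows:
        w = int(ts) // WINDOW_SECS
        pfx = longest_match(ipaddress.ip_address(dst), prefixes)
        if pfx is None:
            continue
        windows[w]["hit"].add(pfx)
        if is_spider(ipaddress.ip_address(src)):
            windows[w]["spider_hit"].add(pfx)
    total = len(prefixes) or 1
    for w, d in sorted(windows.items()):
        print(f"window {w}: {len(d['hit'])/total:.1%} of prefixes hit, "
              f"{len(d['spider_hit'])/total:.1%} hit by labeled spider sources")

# Usage (again, hypothetical file names):
#   coverage(read_flows("flows.txt"), load_prefixes("rib-prefixes.txt"))

Obviously you'd want a radix tree instead of the linear match, and multiple vantage points as noted above, but even this crude version would tell you whether the "80% of the table every <N minutes>" figure survives once you subtract the spider sources.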