
On Sun, Jul 6, 2025, at 10:47 AM, William Herrin via NANOG wrote:
On Sun, Jul 6, 2025 at 7:11 AM Rich Kulawiec via NANOG <nanog@lists.nanog.org> wrote:
We can't have nice things generously built for the common good any more because there are too many selfish and greedy thugs who don't care about anything except their own wealth, power, and egos.
I've wondered if it'd work to place invisible links in the page and then block the source for a while any time one of the invisible links is clicked. Just the classic one-pixel transparent graphic but with a link that the log reaper understands to mean "bot was here."
Haven't tried it. Nearly all of my content is static so I don't have enough of a crawler problem to bother.
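The "log reaper" half of that idea could be sketched roughly as follows. This is only an illustration, not Bill's implementation: the trap path name (/pixel-trap.gif), the log format (Apache/nginx combined), and the iptables follow-up are all assumptions.

```python
# Hypothetical sketch of the "log reaper" idea: scan a combined-format
# access log for hits on the invisible trap link and collect the source
# IPs to block. The trap path is an assumed example name.
import re

TRAP_PATH = "/pixel-trap.gif"  # hidden 1x1 transparent GIF's URL

# Matches the client IP and request path of a combined-log-format line.
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (\S+)')

def reap(log_lines):
    """Return the set of client IPs that fetched the trap link."""
    bots = set()
    for line in log_lines:
        m = LOG_RE.match(line)
        if m and m.group(2) == TRAP_PATH:
            bots.add(m.group(1))
    return bots

# Each IP could then be blocked for a while, e.g. (illustrative only):
#   for ip in reap(open("/var/log/nginx/access.log")):
#       subprocess.run(["iptables", "-A", "INPUT", "-s", ip, "-j", "DROP"])
```

A robots.txt Disallow on the trap path is worth adding so well-behaved crawlers never trip it; only bots that ignore robots.txt get blocked.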
For the crawler mitigations I've personally been involved with, this would not have worked. The source IPs numbered at least in the tens of thousands. They didn't repeat; each source IP made one request and was never seen again. They didn't aggregate into prefixes we could filter. They didn't use any common identifier we could find to filter on, including the user-agent, which looked valid but was randomized. In my case, we were able to simply put 99% of the site behind a login, which mitigated the problem. Many sites don't have that option.

Dan