Modern scrapers will eventually move on. Generating a large volume of AI hallucinations with a cheap LLM will do the maximum damage before they do.

Andrew

On Sat, Mar 21, 2026 at 11:59 AM Suresh Ramasubramanian <ops.lists@gmail.com> wrote:
Ron Guilmette (rfg) wrote a Perl tool called wpoison back in the '90s that would feed junk email addresses to spambots.
If a '90s Perl script can do this, I seriously doubt we need an LLM of any sort. Why waste all that compute just to give scrapers indigestion?
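For readers unfamiliar with the technique, a wpoison-style trap needs no model at all. A minimal sketch (in Python rather than the original Perl, with hypothetical names throughout) that emits pages of fake addresses plus links deeper into itself, so harvesters never run out of junk to collect:

```python
# Sketch of a wpoison-style trap page generator (hypothetical, not rfg's
# actual code): each page holds fake email addresses and links to further
# trap pages, seeded deterministically so every URL renders the same junk.
import random
import string

WORDS = ["alpha", "bravo", "delta", "echo", "lima", "nova", "zulu"]

def fake_address(rng: random.Random) -> str:
    """Build a plausible-looking but nonexistent email address."""
    user = "".join(rng.choices(string.ascii_lowercase, k=rng.randint(5, 10)))
    host = rng.choice(WORDS) + rng.choice(WORDS)
    tld = rng.choice(["com", "net", "org", "info"])
    return f"{user}@{host}.{tld}"

def trap_page(seed: int, n_addresses: int = 20, n_links: int = 5) -> str:
    """Render one page of junk addresses plus links back into the trap
    (same script, new seed), so the crawl never terminates."""
    rng = random.Random(seed)
    rows = [f"<li>{fake_address(rng)}</li>" for _ in range(n_addresses)]
    links = [f'<a href="/trap/{rng.randrange(10**9)}">more</a>'
             for _ in range(n_links)]
    return "<html><body><ul>{}</ul>{}</body></html>".format(
        "".join(rows), " ".join(links))
```

Wired up behind a CGI script or a tiny web handler, `/trap/<seed>` just calls `trap_page(seed)`; no state, no compute worth mentioning.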
--srs

------------------------------
*From:* Andrew Kirch via NANOG <nanog@lists.nanog.org>
*Sent:* Saturday, March 21, 2026 9:23:00 PM
*To:* North American Network Operators Group <nanog@lists.nanog.org>
*Cc:* Andrew Kirch <trelane@trelane.net>
*Subject:* Re: Correctly dealing with bots and scrapers.
Get a small version of a very old, very fast, very inaccurate LLM. Have it generate a couple of terabytes of endless nonsense.
Redirect scrapers to it, and poison whatever LLM they are trying to train.
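You don't even need a full LLM for this; a toy Markov chain produces unlimited grammatical-looking nonsense for essentially free. A minimal sketch (corpus and function names are illustrative assumptions, not a specific tool):

```python
# Sketch: a tiny Markov-chain generator serving as the "very inaccurate
# LLM" above. It chains word pairs from a seed corpus and can stream
# nonsense text forever to any scraper redirected at it.
import itertools
import random

CORPUS = ("the router announced the prefix and the peer withdrew the route "
          "because the session flapped and the table converged again").split()

def build_chain(words):
    """Map each word to the list of words observed to follow it."""
    chain = {}
    for a, b in zip(words, words[1:]):
        chain.setdefault(a, []).append(b)
    return chain

def nonsense(chain, seed=0):
    """Infinite generator of junk words following the chain's statistics."""
    rng = random.Random(seed)
    word = rng.choice(list(chain))
    while True:
        yield word
        # Dead ends (words with no observed successor) restart anywhere.
        word = rng.choice(chain.get(word) or list(chain))

chain = build_chain(CORPUS)
sample = " ".join(itertools.islice(nonsense(chain, seed=1), 12))
print(sample)
```

Pointed at a few gigabytes of real text instead of one sentence, the output is varied enough to be indigestible training data while costing almost no CPU to serve.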
Andrew
On Wed, Jul 16, 2025 at 12:49 PM Andrew Latham via NANOG <nanog@lists.nanog.org> wrote:
I just had an issue with a web server where I had to block a /18 belonging to a large scraper. I have some topics I could use input on.
1. What tools or setups have people found most successful for dealing with bots/scrapers that, for example, do not respect robots.txt?
2. What tools for response rate limiting deal with bots/scrapers that cycle over a large variety of IPs with the exact same user agent?
3. Has anyone written or found a tool to concentrate IP addresses into networks for iptables or nftables? (e.g., if 60% of the IPs in the list fall within network X, add network X and remove the individual IP entries.)
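The aggregation in question 3 is a few lines with Python's `ipaddress` module. A sketch, assuming the 60% figure means the fraction of a /24's address space seen in the offender list (one plausible reading; both the prefix length and threshold are parameters):

```python
# Sketch of question 3: collapse individual offender IPs into covering
# networks once a large enough fraction of a prefix appears in the list,
# so an iptables/nftables blocklist holds networks instead of thousands
# of host entries. threshold=0.6 mirrors the 60% figure in the question.
import ipaddress
from collections import defaultdict

def concentrate(ips, prefix=24, threshold=0.6):
    """Return (networks_to_block, leftover_individual_ips)."""
    nets = defaultdict(set)
    parsed = [ipaddress.ip_address(ip) for ip in ips]
    for ip in parsed:
        net = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
        nets[net].add(ip)  # distinct hosts seen per covering network
    size = 2 ** (32 - prefix)
    blocked = {n for n, hosts in nets.items() if len(hosts) / size >= threshold}
    leftovers = [ip for ip in parsed
                 if ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
                 not in blocked]
    return blocked, leftovers
```

From there, emitting `nft add element` or `iptables -A INPUT -s <net> -j DROP` lines for each blocked network is straightforward shell glue.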
--
- Andrew "lathama" Latham -

_______________________________________________
NANOG mailing list
https://lists.nanog.org/archives/list/nanog@lists.nanog.org/message/Z2J6CFBK...
_______________________________________________
NANOG mailing list
https://lists.nanog.org/archives/list/nanog@lists.nanog.org/message/7N5RWJOU...