
I fight too, but I'm slowly losing the battle. Luckily, it's just my private stuff. I run a distributed firewall to guard all my servers at once:

fwcli> stats rules
946 rules in DB

All subnets are /24 or bigger, and they keep coming. I'm slowly starting to think it's time to pull my basic services off the Internet.

---------- Original message ----------
From: Andrew Latham via NANOG <nanog@lists.nanog.org>
To: North American Network Operators Group <nanog@lists.nanog.org>
Cc: Andrew Latham <lathama@gmail.com>
Subject: Correctly dealing with bots and scrapers.
Date: Wed, 16 Jul 2025 10:48:39 -0600

I just had an issue with a web server where I had to block a /18 of a large scraper. I have some topics I could use some input on.

1. What tools or setups have people found most successful for dealing with bots/scrapers that do not respect robots.txt, for example?

2. What tools for response rate limiting deal with bots/scrapers that cycle over a large variety of IPs with the exact same user agent?

3. Has anyone written or found a tool to concentrate IP addresses into networks for IPTABLES or NFT? (60% of IPs for network X in list, so add network X and remove individual IP entries.)

--
- Andrew "lathama" Latham -
_______________________________________________
NANOG mailing list
https://lists.nanog.org/archives/list/nanog@lists.nanog.org/message/Z2J6CFBK...
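
The aggregation described in question 3 is straightforward to script with Python's stdlib ipaddress module. This is a minimal sketch, not an existing tool: it groups listed addresses by their covering /24 and collapses a bucket into a single network entry once enough distinct hosts appear in it. The /24 prefix length and the threshold of 10 hosts are illustrative assumptions, not values from the original post.

```python
# Sketch: collapse individual IPs into covering networks for a block list.
# PREFIX and THRESHOLD are assumptions chosen for illustration.
import ipaddress
from collections import defaultdict

PREFIX = 24        # aggregate to /24 networks (assumption)
THRESHOLD = 10     # distinct hosts in a /24 before collapsing (assumption)

def aggregate(ips, prefix=PREFIX, threshold=THRESHOLD):
    """Return a list of strings: /24 networks where the host count meets
    the threshold, individual addresses otherwise."""
    buckets = defaultdict(set)
    for ip in ips:
        addr = ipaddress.ip_address(ip)
        # strict=False lets us derive the covering network from a host address
        net = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
        buckets[net].add(addr)

    result = []
    for net, hosts in buckets.items():
        if len(hosts) >= threshold:
            result.append(str(net))               # one rule covers the whole net
        else:
            result.extend(str(h) for h in sorted(hosts))
    return sorted(result)
```

The output can be fed to an nft set or an ipset so the kernel matches against networks instead of hundreds of individual /32 entries; rerunning the script periodically keeps the list concentrated as new offenders show up.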