
On 7/1/25 8:22 PM, Constantine A. Murenin via NANOG wrote:
> But the bots are not a problem if you're doing proper caching and throttling.
Not all site traffic is cacheable or can be farmed out to a CDN. Dynamic (especially per-session) requests, as in e-commerce, can't be cached. Putting an item into the shopping cart is typically one of the more resource-intensive events. We have seen bots that will click the buy button and put items into the cart, possibly to see whether any discounts are given. You end up with hundreds of active 'junk' cart sessions on a small site that was not designed for that much traffic.

Forcing the bot (or a legit customer) to create yet another login before creating a cart can help, but that generates pushback from the store owner. The owners don't want a login until the payment-details phase, or they want purchasers to be able to do a guest checkout. They will point out that on amazon.com you don't have to log in to put an item in the cart.

Rate limiting is not effective when the requests come from different IP ranges. The old days of using a Class C (/24) as the rate-limit key are over. The bots come from all over a provider's address space (often Azure's, but it can be any of the larger providers) and often from different regions. If you throttle EVERYONE, then legit customers can get locked out with 429s or even 503s.

And as has been pointed out, relying on the browser string is no longer effective. They use common strings and change them dynamically.

Sincerely,

William Kern
PixelGate Networks.
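P.S. To make the /24 rate-limit point concrete, here is a minimal sketch (hypothetical names, illustrative window and limit values) of a sliding-window limiter keyed on the client's /24, and why bots spread across many ranges sail right past it:

```python
# Minimal sketch of /24-keyed rate limiting and why distributed bots
# defeat it. WINDOW, LIMIT, and all names here are illustrative.
import ipaddress
import time
from collections import defaultdict

WINDOW = 60   # seconds
LIMIT = 30    # requests per key per window

hits = defaultdict(list)  # key -> timestamps of recent requests

def rate_limit_key(ip: str) -> str:
    """Old-school key: collapse the client IP to its /24 network."""
    return str(ipaddress.ip_network(f"{ip}/24", strict=False))

def allowed(ip: str, now=None) -> bool:
    """Sliding-window check; False means 'send a 429'."""
    now = time.monotonic() if now is None else now
    key = rate_limit_key(ip)
    recent = [t for t in hits[key] if now - t < WINDOW]
    hits[key] = recent
    if len(recent) >= LIMIT:
        return False
    recent.append(now)
    return True

# One bot hammering from a single /24 is throttled quickly...
assert all(allowed(f"203.0.113.{i}", now=0.0) for i in range(30))
assert not allowed("203.0.113.99", now=1.0)

# ...but the same volume spread over 31 different /24s all passes,
# which is exactly the distributed-cloud pattern described above.
assert all(allowed(f"198.18.{i}.7", now=0.0) for i in range(31))
```

Per-IP keys fare even worse against this pattern; the only knobs left are global throttles, which is how the 429s/503s end up hitting legit customers.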