Comment Re:Destroying Websites? (Score 5, Interesting) 85

As someone who's actively fighting this type of traffic, let me share my perspective.

I run a small-ish website with user peaks of around 50 requests per second. Over the last couple of months, it has been getting hit with loads of up to 300 requests per second by these kinds of bots. They use distributed IPs and random user agents, which makes them hard to block.

My site has a lot of data and pages to crawl, and despite an appropriate robots.txt, these things ignore it and just scan endlessly. My website isn't for profit; I run it more or less as a hobby, so it has trouble handling a nearly 10x increase in traffic. My DNS costs have gone up significantly, with some 150 million DNS requests served this month.
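For the curious, the robots.txt in question is nothing exotic. Something along these lines (illustrative paths, not my exact file) is what well-behaved crawlers honor and these bots simply skip:

```
# Illustrative robots.txt: slow everyone down, keep crawlers out of deep data pages
User-agent: *
Crawl-delay: 10
Disallow: /search
Disallow: /data/
```

Crawl-delay isn't part of the official Robots Exclusion Protocol, but classic search crawlers respected it; these new scrapers don't even fetch the file.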

The net effect is that my website slows down and becomes unresponsive under these scans, and I am looking at spending more money just to manage the excess traffic.
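Since robots.txt is ignored, the only real lever left is server-side throttling. Here's a minimal Python sketch of the kind of per-IP token-bucket rate limiting I mean (the rate and burst numbers are illustrative, and a deterministic clock parameter is included just to make it testable):

```python
import time

class TokenBucketLimiter:
    """Per-client token bucket: each IP earns `rate` tokens/sec, bursting up to `capacity`."""

    def __init__(self, rate=5.0, capacity=10.0):
        self.rate = rate
        self.capacity = capacity
        self.buckets = {}  # ip -> (tokens_remaining, last_seen_timestamp)

    def allow(self, ip, now=None):
        """Return True if this request from `ip` should be served, False if throttled."""
        now = time.monotonic() if now is None else now
        tokens, last = self.buckets.get(ip, (self.capacity, now))
        # Refill tokens for the time elapsed since the last request, capped at capacity.
        tokens = min(self.capacity, tokens + (now - last) * self.rate)
        if tokens >= 1.0:
            self.buckets[ip] = (tokens - 1.0, now)
            return True
        self.buckets[ip] = (tokens, now)
        return False
```

Of course, with thousands of distributed source IPs each staying under the per-IP limit, this only caps the damage per address; it doesn't make the aggregate load go away.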

Is it destroying my site? No, not really. But it absolutely increases costs and forces me to spend more money and hours on infrastructure than I otherwise would. These things hurt smaller communities by pushing significant cost increases onto people who may have difficulty covering them, so calling it bullshit isn't exactly accurate.
