Yes, that's what I meant: limit the rate at which requests are accepted, rejecting anything over that. 30 is perhaps a bit high; you want to allow humans but block bots, and there's no way a human would be submitting a request every 2 seconds.

In reply to: I need to limit the actual number of requests per minute from any given source (to somewhere around 30-60, I'd think).
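For what it's worth, here's a minimal sketch of the kind of per-source limit being discussed, assuming a sliding one-minute window keyed by client IP. The 30-per-minute figure comes from the thread; the function and variable names are just illustrative.

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60    # look at the last minute of traffic
MAX_REQUESTS = 30      # reject anything over ~30 requests/minute per source

_history = defaultdict(deque)  # client IP -> timestamps of its recent requests

def allow_request(client_ip, now=None):
    """Return True if this request is under the per-IP limit, False to reject it."""
    now = time.time() if now is None else now
    recent = _history[client_ip]
    # Drop timestamps that have fallen out of the sliding window.
    while recent and now - recent[0] > WINDOW_SECONDS:
        recent.popleft()
    if len(recent) >= MAX_REQUESTS:
        return False
    recent.append(now)
    return True
```

The request handler would call allow_request() before doing any real work and send back a 429 whenever it returns False.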
I used to automate updates to .htaccess through scripting by tailing the logs. Easier to do behind a load balancer, though. -- jim

In reply to: I use shared hosting, so I can't install new mods, but it's easy to figure out whether an excessive number of requests is coming from the same IP address or user agent, and then block it in .htaccess. It's a lot simpler and might be enough in your case.
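A rough sketch of the log-tailing automation jim describes, assuming an Apache combined-format access log and the old `Deny from` syntax (still available via mod_access_compat on Apache 2.4). The log path, .htaccess path, threshold, and interval are all placeholders, not anything from the thread.

```python
import time
from collections import Counter

ACCESS_LOG = "/var/log/apache2/access.log"  # assumed log location
HTACCESS = "/var/www/html/.htaccess"        # assumed .htaccess location
THRESHOLD = 60                              # requests per interval before blocking
INTERVAL = 60                               # seconds between checks

blocked = set()

def block_ip(ip):
    """Append a deny rule for this IP (assumes 'Order allow,deny' is already in effect)."""
    with open(HTACCESS, "a") as f:
        f.write(f"Deny from {ip}\n")
    blocked.add(ip)

def tail_and_block():
    """Follow the access log like `tail -f` and block any IP that exceeds the threshold."""
    with open(ACCESS_LOG) as log:
        log.seek(0, 2)                      # start at the current end of the file
        while True:
            hits = Counter()
            deadline = time.time() + INTERVAL
            while time.time() < deadline:
                line = log.readline()
                if not line:
                    time.sleep(0.5)
                    continue
                hits[line.split(" ", 1)[0]] += 1   # first field is the client IP
            for ip, count in hits.items():
                if count > THRESHOLD and ip not in blocked:
                    block_ip(ip)
```

This assumes somewhere you can run a long-lived script, which is exactly what shared hosting usually rules out.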
I did that as well, to block crawlers that weren't honoring the robots file. I had a couple of hidden links on the main page that were disallowed in robots.txt, and if someone tried to get those URLs, their IP got added to the .htaccess. I was always terrified there was a bug in there and I'd accidentally lock everyone out...

In reply to: I used to automate updates to .htaccess through scripting by tailing the logs.
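The detection side of that hidden-link trap is just matching requested paths against the URLs that robots.txt disallows. A hypothetical sketch: the trap paths are made up, and it assumes the same combined-format log lines as above.

```python
# Paths listed under Disallow: in robots.txt and linked invisibly from the main
# page, so only a crawler that ignores robots.txt should ever request them.
TRAP_PATHS = {"/do-not-crawl/", "/hidden-link.html"}  # hypothetical examples

def trap_hit(log_line):
    """Return the client IP if this access-log line requests a trap URL, else None."""
    parts = log_line.split('"')
    if len(parts) < 2:
        return None
    request = parts[1].split()            # e.g. ['GET', '/hidden-link.html', 'HTTP/1.1']
    if len(request) >= 2 and request[1] in TRAP_PATHS:
        return log_line.split(" ", 1)[0]  # first field is the client IP
    return None
```

Feeding any non-None result into something like the block_ip() helper above reproduces the trap, bugs and accidental lockouts not included.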
Thanks for all you do.

In reply to: Right, another day, another bunch of attacks. For those interested in these things, fail2ban is now running, and "appears" to be doing a solid job. I'll keep my eyes on it, but let's see how this performs.
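For anyone curious what a fail2ban setup like that looks like, here's a minimal jail.local sketch. The post doesn't say which jails or thresholds are actually in use, so everything below is just a plausible starting point, not the configuration running on this server.

```ini
# /etc/fail2ban/jail.local -- local overrides, leaving the stock jail.conf untouched
[DEFAULT]
bantime  = 3600       # seconds an offending IP stays banned
findtime = 600        # window in which maxretry failures trigger a ban
maxretry = 5
ignoreip = 127.0.0.1/8

[sshd]
enabled = true

[apache-badbots]
# stock filter that bans clients matching known bad-bot user agents
enabled = true
logpath = /var/log/apache2/access.log
```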
I'm removing the reboots from the cron job for now, to see how long I can keep the server up.