ASL Scenario Archive is down

Pacman Ghost

Member
Joined
Feb 25, 2017
Messages
347
Reaction score
188
Location
A maze of twisty little passages, all alike
Country
Australia
I need to limit the actual number of requests per minute from any given source (to somewhere around 30-60, I'd think).
Yes, that's what I meant :) Limit the rate at which requests are accepted, rejecting anything over that. 30 is perhaps a bit high; you want to allow humans but block bots, and there's no way a human would be submitting a request every 2 seconds.

I use shared hosting, so I can't install new mods, but it's easy to figure out whether an excessive number of requests is coming from the same IP address or user agent, and then block it in .htaccess. It's a lot simpler and might be enough in your case.
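
For illustration, the sort of .htaccess rules I mean; the IP address and user-agent string below are made-up placeholders (on Apache 2.4 the equivalents are "Require not ip" and "Require not env"):

    # Block a single abusive IP (203.0.113.45 is a placeholder
    # from the documentation address range)
    Order allow,deny
    Allow from all
    Deny from 203.0.113.45

    # Block by user agent ("BadBot" is a placeholder string)
    SetEnvIfNoCase User-Agent "BadBot" bad_bot
    Deny from env=bad_bot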
 

Sparafucil3

Forum Guru
Joined
Oct 7, 2004
Messages
9,883
Reaction score
2,861
Location
USA
First name
Jim
Country
United States
I use shared hosting, so I can't install new mods, but it's easy to figure out whether an excessive number of requests is coming from the same IP address or user agent, and then block it in .htaccess. It's a lot simpler and might be enough in your case.
I used to automate updates to .htaccess through scripting by tailing the logs. Easier to do behind a load balancer, though. :) -- jim
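
The idea was roughly this (the log path, threshold, and .htaccess location here are illustrative, not what I actually ran):

    #!/bin/sh
    # Sketch: count hits per IP in the access log and append a deny
    # rule for any IP over the threshold. Assumes the .htaccess already
    # has "Order allow,deny" / "Allow from all" in place.
    LOG=/var/log/apache2/access.log
    HTACCESS=/var/www/html/.htaccess
    THRESHOLD=100

    awk '{print $1}' "$LOG" | sort | uniq -c |
    while read -r count ip; do
        if [ "$count" -gt "$THRESHOLD" ] && ! grep -qF "Deny from $ip" "$HTACCESS"; then
            echo "Deny from $ip" >> "$HTACCESS"
        fi
    done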
 

Pacman Ghost

Member
Joined
Feb 25, 2017
Messages
347
Reaction score
188
Location
A maze of twisty little passages, all alike
Country
Australia
I used to automate updates to .htaccess through scripting by tailing the logs.
I did that as well, to block crawlers that weren't honoring the robots file. I had a couple of hidden links on the main page that were disallowed in robots.txt, and if someone tried to fetch those URLs, their IP got added to the .htaccess. I was always terrified there was a bug in there that would accidentally lock everyone out... :rolleyes:
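
The moving parts were roughly these (the trap path and markup are illustrative; the handler behind it appended the caller's IP to .htaccess, much like the script above):

    # robots.txt: a well-behaved crawler will never request this path
    User-agent: *
    Disallow: /trap/

    <!-- hidden link on the main page: invisible to humans, followed by bots -->
    <a href="/trap/honeypot.html" style="display:none">do not follow</a>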
 

daveramsey

Senior Member
Joined
Jun 10, 2006
Messages
1,616
Reaction score
481
Location
Hertfordshire
First name
Dave
Country
United Kingdom
Right, another day, another bunch of attacks. For those interested in these things, fail2ban is now running, and "appears" to be doing a solid job. I'll keep my eyes on it, but let's see how this performs.

I'm removing the reboots from the cron job for now, to see how long I can keep the server up.
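
For anyone unfamiliar with it: fail2ban watches a log file for lines matching a filter and firewalls any IP that matches too often within a time window. A request-rate filter can be as blunt as matching every access-log line; this is a generic sketch, not my exact filter:

    # /etc/fail2ban/filter.d/http-get-dos.conf (illustrative)
    [Definition]
    # Match every request line in the Apache access log; the jail's
    # findtime/maxretry settings then act as the rate limit.
    failregex = ^<HOST> -
    ignoreregex =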
 

JRKrejsa

Elder Member
Joined
Sep 21, 2005
Messages
3,428
Reaction score
851
Location
USA
Country
United States
Right, another day, another bunch of attacks. For those interested in these things, fail2ban is now running, and "appears" to be doing a solid job. I'll keep my eyes on it, but let's see how this performs.

I'm removing the reboots from the cron job for now, to see how long I can keep the server up.
Thanks for all you do.
 

daveramsey

Senior Member
Joined
Jun 10, 2006
Messages
1,616
Reaction score
481
Location
Hertfordshire
First name
Dave
Country
United Kingdom
Of course, now I've set the threshold too low. I managed to ban myself, along with a bunch of you guys, for using the archive too much :) That's now resolved.

One of the issues here is knowing how many requests per 30 seconds is reasonable. Note that on the home page alone there are about 20 requests for icons, images, etc. if you don't already have them cached.

Anyway - for anyone interested, my fail2ban config is available. It's currently banning anyone making more than 200 requests in 30 seconds. The DB has 4000 available connections, no (apparently) sleeping connections, and a bunch of free workers available on the webserver. I've got my eye on it, so if it does go down again today I'll be able to take a look.
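
The jail settings for "more than 200 requests in 30 seconds" look roughly like this (the jail/filter name, log path, and one-hour ban time are illustrative; only the 200/30 figures are the real ones):

    # /etc/fail2ban/jail.local (illustrative sketch)
    [http-get-dos]
    enabled  = true
    port     = http,https
    filter   = http-get-dos
    logpath  = /var/log/apache2/access.log
    findtime = 30
    maxretry = 200
    bantime  = 3600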
 