
Securing a server against DDoS-like spam runs

@Margaret670

Posted in: #IpAddress #Spam

Periodically my server gets hammered so hard by automated comment spam runs against my blogs that it goes down, and I have to pester my provider for a power cycle if I don't catch it in time and start blocking IP addresses. I spend hours using netstat commands (copied and pasted as suggested by other people) to generate lists of IPs and connection counts. Usually the source turns out to be a web hosting company with a lax attitude to this sort of thing, so I try to block its entire range in IPTables. If I keep at it long enough, things calm down for a while. Then the whole cycle starts again.

What else can I do to stop spammers bringing my server to its knees every month or so?

My IPTables block list seems huge and I have even imported massive lists, all to no avail. I seem to recall installing things like mod_evasive and Shields Up way back when (I can't really remember), but clearly that is no longer working.





2 Comments


@Sue5673885

Putting your server behind Cloudflare can help solve your issues (or at least mitigate most of them) with little effort. Many features are available for free, and it will save you a lot of time and pain.

P.S.: I am not affiliated with Cloudflare in any way.



 

@Candy875

For small-scale things and simple, plain DoS attacks, mod_evasive and mod_limitipconn are Apache modules that would both work. I prefer mod_evasive personally; you might want to tighten its settings after installation. There's good documentation for it here: www.atomicorp.com/wiki/index.php/Mod_evasive
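For a concrete picture, a mod_evasive configuration might look like the following. The directive names are the module's real ones, but the thresholds are illustrative values you would tune to your own traffic, not recommendations:

```apache
<IfModule mod_evasive20.c>
    # Size of the hash table tracking per-IP hit counts
    DOSHashTableSize    3097
    # Block an IP requesting the same page more than 2 times per second
    DOSPageCount        2
    DOSPageInterval     1
    # Block an IP making more than 50 requests site-wide per second
    DOSSiteCount        50
    DOSSiteInterval     1
    # How long (in seconds) a blocked IP stays blocked
    DOSBlockingPeriod   10
</IfModule>
```

The IfModule name shown here is the common mod_evasive20.c; it can differ between builds and distribution packages, so check yours.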
For over-eager site crawlers (Bingbot is usually the worst offender), you'll need to set a Crawl-delay in your robots.txt. Documented here: en.wikipedia.org/wiki/Robots_exclusion_standard#Crawl-delay_directive
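For example, a robots.txt entry using the directive might look like this (note that Bing and Yandex honour Crawl-delay, while Googlebot ignores it):

```
User-agent: bingbot
Crawl-delay: 10
```

The value is the number of seconds the crawler should wait between requests.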
For DDoS attacks that don't overload your network capacity but do overload your server's ability to process pages, there's the smart solution, and then there's the cheating, mysiteisdownohshitfixfixfix solution. Both require you to identify something in the inbound requests that is 'wrong': a user agent, a string in the URL, etc.

The smart solution is to install fail2ban, write a regex filter for it, and configure it to scan your access logs. The regex should match whatever you've previously identified as 'wrong'. Fail2ban will periodically check your logs against the regex and ban anyone who matches (for the configured bantime) from accessing your server entirely.
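As a sketch, suppose the 'wrong' thing you identified is a POST to your blog's comment endpoint. A fail2ban filter and jail for that could look like the following; the filter name blogspam, the endpoint path, and the log path are all made-up examples:

```ini
# /etc/fail2ban/filter.d/blogspam.conf
[Definition]
failregex = ^<HOST> .* "POST /comments/post

# /etc/fail2ban/jail.local
[blogspam]
enabled   = true
filter    = blogspam
logpath   = /var/log/apache2/access.log
maxretry  = 10
findtime  = 60
bantime   = 3600
```

Here <HOST> is fail2ban's placeholder for the offending IP, and maxretry/findtime together mean: ban any IP that matches ten times within a minute, for an hour.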

The mysiteisdown solution is to do what fail2ban does, but with bash: a for loop, tail, grep, cut and awk.

What you need to do, is something along the lines of the following:

for ip in $(tail -n 10 /path/to/access_log | grep 'bad' | awk '{print $1}'); do
    # $1 is the client IP, the first field of a common-format access log
    route add -host "$ip" gw 127.0.0.1
done


Put that into a .sh script and run it via cron. Adjust the -n number to match how fast your log grows. Ignore the duplicate-route warnings thrown by route. Slightly less Heath Robinson versions would include a grep -v to remove your own IP in case it gets matched accidentally, and would log the blocked IPs to a file for later perusal, scripting, or dissemination.
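A slightly sturdier variant of the script above, with the grep -v and the logging folded in, could be factored like this. The access log path, the 'bad' pattern, your own IP (203.0.113.1 here), and the log file name are all placeholders:

```shell
#!/bin/sh
# Print unique client IPs (the first field of a common-format access log)
# from the last lines of LOGFILE that match PATTERN, excluding MYIP so you
# can't lock yourself out. sort -u avoids feeding route duplicates.
extract_bad_ips() {
    logfile=$1; pattern=$2; myip=$3
    tail -n 10 "$logfile" 2>/dev/null | grep "$pattern" \
        | awk '{print $1}' | grep -vF "$myip" | sort -u
}

# Blackhole each offender and keep a record for later perusal.
for ip in $(extract_bad_ips /path/to/access_log 'bad' 203.0.113.1); do
    route add -host "$ip" gw 127.0.0.1 2>/dev/null
    echo "$(date) blocked $ip" >> /var/log/blocked_ips.log
done
```

Running it every minute from cron keeps the window small, and the log file gives you something to grep when a hosting company asks for evidence.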

Hopefully this helps.


