Post: [NGINX] Prevent & Block Layer-7 DDoS
10-18-2015, 08:40 PM #1
Octolus
I defeated!
I don't know how many members here are interested in this, or even run their own servers, but here are some tips for dealing with Layer-7 attacks. This is what I've found works best.

Sometimes CloudFlare or other services aren't enough. There are booters like 'YouBoot' that bypass CloudFlare's 'Attack Mode'. I've had great success blocking attacks from up to 800 Windows bots.




Limiting Requests
This can backfire if you don't do it correctly: limiting requests can block normal users from accessing your website. That's why I always limit requests to PHP documents only.
You can do it with something as simple as this.

Find location ~ \.php$ { and add limit_req zone=one burst=5; inside it, in your nginx.conf or wherever your setup keeps it. In my case the PHP handling lives in a separate include file called php.conf.

Now add limit_req_zone $binary_remote_addr zone=one:10m rate=5r/s; inside your http { block.

As you'd expect, this limits each client to a maximum of 5 requests per second, which is usually more than a real user needs for PHP documents. If a client makes more requests than that, I'd consider it a DDoS.
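Putting the two directives together, a minimal sketch looks like this (the listen port and the fastcgi_pass socket path are assumptions for illustration, not from the original post):

```nginx
http {
    # track clients by IP; 10 MB of shared state, 5 requests/second each
    limit_req_zone $binary_remote_addr zone=one:10m rate=5r/s;

    server {
        listen 80;

        location ~ \.php$ {
            # allow a short burst of up to 5 queued requests before rejecting
            limit_req zone=one burst=5;
            include fastcgi_params;
            fastcgi_pass unix:/var/run/php-fpm.sock;  # assumed PHP-FPM socket
        }
    }
}
```

Static files are served without any limit, so normal browsing is unaffected; only PHP hits count against the zone.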




Disable Error & Access Logs
This cuts both ways: good when you are under attack, bad if someone hacks your website and you want to figure out what they did afterwards. During a Layer-7 attack your logs grow massive; in my experience, disabling access and error logs can cut CPU usage by more than 50%.
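As a sketch, the relevant directives look like this. Note that the error log cannot be switched off entirely, only raised to a quieter level, and that the Fail2Ban method later in this post reads limit_req messages from the error log, so don't silence it if you use that:

```nginx
server {
    # stop writing an entry for every request
    access_log off;

    # raising the level to crit drops the high-volume error-level
    # messages (including limit_req ones, which Fail2Ban needs)
    error_log /var/log/nginx/error.log crit;
}
```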




Block Bad User-Agents
To block bad user-agents, simply add this to your server { block.

Empty User-Agents (Usually Joomla Attacks):
if ($http_user_agent = "") { return 444; }
if ($http_user_agent = " ") { return 444; }
if ($http_user_agent = "-") { return 444; }

WordPress, Joomla, GHP User-Agents:
if ($http_user_agent ~* "PHP|curl|Wget|HTTrack|Nmap|Verifying|PingBack|Pingdom|Joomla|Wordpress") { return 444; }

Returning 444 makes nginx close the connection immediately without sending any response, which uses barely any resources.

There are some downsides:
- Facebook seems to use an empty user-agent when it crawls your website.
- Clients connecting to APIs might use empty user-agents.
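The same checks can also be expressed with a map block, which nginx handles more cleanly than a chain of if directives; the variable name $block_ua here is my own, not from the original post:

```nginx
http {
    # 1 = blocked, 0 = allowed
    map $http_user_agent $block_ua {
        default  0;
        ""       1;  # empty user-agent
        " "      1;
        "-"      1;
        "~*(PHP|curl|Wget|HTTrack|Nmap|Verifying|PingBack|Pingdom|Joomla|Wordpress)"  1;
    }

    server {
        if ($block_ua) { return 444; }
    }
}
```

The map is evaluated once per request and keeps all the patterns in one place, so adding or removing an agent means editing a single line.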




Fail2Ban & Limit Requests
A perfect combination against botnet attacks, though it requires some processing power and RAM. I take no credit for this method; I found it elsewhere and tweaked it a little.

In /etc/fail2ban, create a file called "jail.local" if it isn't there already, then add this content to it:
[nginx-req-limit]

enabled = true
filter = nginx-req-limit
port = all
action = iptables-allports
logpath = /home/nginx/domains/google.com/error.log
findtime = 1200
bantime = 172800
maxretry = 3


Pretty self-explanatory: Fail2Ban scans error.log using the nginx-req-limit filter, and if a client hits the request limit three times within the findtime window (1200 seconds), it bans that IP for 172800 seconds (48 hours).

Now create a file called nginx-req-limit.conf inside /etc/fail2ban/filter.d/ containing:
# Fail2Ban configuration file
#
# supports: ngx_http_limit_req_module module

[Definition]

failregex = limiting requests, excess:.* by zone.*client: <HOST>

# Option: ignoreregex
# Notes.: regex to ignore. If this regex matches, the line is ignored.
# Values: TEXT
#
ignoreregex =


This regex matches the lines nginx writes to the error log when a client exceeds the request limit.
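To sanity-check the pattern, here is a quick Python sketch. The sample line mimics what nginx's limit_req module writes to the error log (the timestamp, PIDs, and IP are made up for illustration), and Fail2Ban's <HOST> tag is swapped for a plain IPv4-matching group:

```python
import re

# Fail2Ban expands <HOST> to a host-matching group; a plain IPv4
# group is enough for this sketch
failregex = (r'limiting requests, excess:.* by zone'
             r'.*client: (?P<host>\d{1,3}(?:\.\d{1,3}){3})')

# illustrative nginx error-log line of the kind limit_req produces
sample = ('2015/10/18 20:40:01 [error] 1234#0: *5678 limiting requests, '
          'excess: 5.320 by zone "one", client: 203.0.113.7, '
          'server: example.com, request: "GET /index.php HTTP/1.1"')

match = re.search(failregex, sample)
print(match.group('host'))  # -> 203.0.113.7
```

On a live box you can do the equivalent against your real log with fail2ban-regex, e.g. fail2ban-regex /path/to/error.log /etc/fail2ban/filter.d/nginx-req-limit.conf.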

Once done, restart fail2ban (service fail2ban restart).

You can run the following command to see if any IPs have been banned: fail2ban-client status nginx-req-limit

Of course:
- Make sure the error.log path is correct.
- Make sure blocked requests actually get logged; you may want to change 444 to 403 for bad user-agents so those get banned too.
- This works against botnet attacks, but it can also ban real visitors who open 25 tabs at once, or 25 tabs with an AJAX shoutbox.




Cookie-Check Module
This is an excellent module for filtering out bad traffic without needing CloudFlare. It's an nginx module that performs a simple cookie check to see whether the client can store cookies. In most botnet and booter attacks the attacking clients can't store cookies, and if they can't, the module denies them access to the site.

There are two great variants:
- Set the cookie through headers.
- Set the cookie through a static HTML document.

Setting it through headers is the most secure option, since it lets you encrypt the cookie and prevent cookie spoofing. The static-document variant is the lightest, since the check is served from a static file.
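The header-based idea can be sketched in plain nginx config: hand out a cookie and redirect, so clients that never send it back stay stuck in the redirect loop while real browsers pass through on the second request. This is a crude conceptual sketch, not the module itself; the cookie name is arbitrary, and unlike the module the cookie here is not encrypted, so any bot that handles cookies gets through:

```nginx
location / {
    # no cookie yet: issue one and bounce the client back;
    # bots that can't store cookies never escape this loop
    if ($cookie_humancheck = "") {
        add_header Set-Cookie "humancheck=1; Path=/";
        return 302 $scheme://$host$request_uri;
    }
    # cookie present: serve the site as usual
    try_files $uri $uri/ /index.php?$args;
}
```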
Last edited by Octolus ; 10-18-2015 at 08:55 PM.

The following 12 users say thank you to Octolus for this useful post:

Algebra, HamoodDev, Norway-_-1999, ParadoxSPRX, Rath, seb5594, Passion, TheRichSlut, TheFreakyClown, Trefad, Tustin
10-19-2015, 06:22 PM #2
Passion
League Champion
Originally posted by Octolus View Post
I don't know how many members here are interested in this, or even run their own servers, but here are some tips for dealing with Layer-7 attacks. […]


Thanks! I use CloudFlare for Layer-7 attacks, but I guess this is better Happy
10-24-2015, 02:40 AM #3
Sloth
Banned
I don't really have a use for this right now since I get zero traffic, but it's always good to have it here just in case XD
12-17-2015, 08:15 PM #4
gopro_2027
Vault dweller
I have no idea what this does, so can someone explain? I could use this for my website.
12-22-2015, 05:42 PM #5
Originally posted by Passion View Post
Thanks! I use cloudflare for layer 7 attacks but i guess this is better Happy


CloudFlare doesn't always block every Layer-7 attack; it blocks all Layer 4, of course, but Layer-7 protection is their paid option
12-22-2015, 07:48 PM #6
Passion
League Champion
Originally posted by Eat
Cloudflare doesn't always block all Layer 7 attack, all Layer 4 ofcourse but Layer 7 is their paid option


I use CloudFlare & blazingfast.io for my Layer-7 protection, as well as some scripts. Nobody has been able to down my site yet.
12-22-2015, 08:18 PM #7
Originally posted by Passion View Post
I use cloudflare & blazingfast.io for my layer7 support aswell as some scripts. Nobody has been able to down my site yet.


> yet
Kappa

The following user thanked Joren for this useful post:

Jelly
12-22-2015, 08:18 PM #8
Passion
League Champion
Originally posted by Joren View Post
> yet
Kappa


Them feels
12-22-2015, 09:41 PM #9
Octolus
I defeated!
Originally posted by Passion View Post
I use cloudflare & blazingfast.io for my layer7 support aswell as some scripts. Nobody has been able to down my site yet.


BlazingFast is alright, despite their long waiting screen. It shouldn't be necessary to use both, though.

Originally posted by Eat
Cloudflare doesn't always block all Layer 7 attack, all Layer 4 ofcourse but Layer 7 is their paid option


Indeed. However, if you set it up properly, you can do a lot with it.

For example, if you have Attack Mode enabled and attacks are still going through, you can challenge the top 10 attacking countries using the Firewall. This prevents booters that have a 'CF-BYPASS' from reaching your website, and also stops a lot of spam.

Visitors from those countries will then have to complete a challenge captcha.
Last edited by Rath ; 12-23-2015 at 01:50 AM.

The following user thanked Octolus for this useful post:

TheRichSlut
05-01-2016, 10:06 AM #10
Geraxy
Banned
I'm not sure this will help me, but good tut anyway

Copyright © 2024, NextGenUpdate.
All Rights Reserved.
