Avoiding automated traffic generated by jerks

I was thinking about web security, and this thought popped into my head.

Say that there's this jerk who hates me and knows how to program. I am managing a nice website/blog with a considerable amount of traffic. Then that jerk creates a program that automatically requests my website over and over again.

If I am hosting my website with a shared hosting provider, then obviously it will stop responding under that load.

This type of attack may not be common, but if someone attempts something like that on my website, I must do something about it. I don't think popular CMSs like WordPress or Drupal do anything about this type of attack.

My assumption is:

If a user makes more than x requests (let's say 50) in one minute, block that user (stop responding).
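
In code, I imagine something roughly like this (just a sketch of the idea; it assumes the APCu extension is available for counting, which may not be the case on shared hosting):

    <?php
    // Sketch of the assumption above: count requests per IP in one-minute
    // buckets using APCu (assumes the APCu extension is installed and enabled).
    $ip    = $_SERVER['REMOTE_ADDR'];
    $key   = 'reqcount_' . $ip . '_' . floor(time() / 60); // one bucket per minute
    $limit = 50;

    apcu_add($key, 0, 60);       // create the counter with a 60-second TTL if missing
    $count = apcu_inc($key);     // increment and read the counter

    if ($count !== false && $count > $limit) {
        http_response_code(429); // "Too Many Requests"
        exit('Rate limit exceeded, try again later.');
    }

    // ...normal page handling continues below...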

My questions are:

  1. Is my assumption OK? If not, what should I do about it?

  2. Do websites like Google, Facebook, YouTube, etc. do something about this type of attack?

There are 4 answers below.

Answer 1:

The issue is partly one of rejecting bad traffic, and partly one of improving the performance of your own code.

Being hit with excess traffic with malicious intent is called a Denial of Service (DoS) attack. The idea is to hit the site with so much traffic that the server can't cope with the load and stops responding, so no-one can get through and the site goes offline.

But you can also be hit with too much traffic simply because your site becomes popular. This can easily happen overnight and without warning, for example if someone posts a link to your site on another popular site. This traffic might actually be genuine and wanted (hundreds of extra sales! yay!), but can have the same effect on your server if you're not prepared for it.

As others have said, it is important to configure your web server to cope with high traffic volumes; I'll let the other answers speak for themselves on that important point, but there are also things you can do in your own code to improve matters.

One of the main reasons a server fails to cope with increased load is the processing time taken by each request.

Your web server can only handle a certain number of requests simultaneously, and the key to reducing the number of simultaneous requests is to reduce the time each request takes to run.

Imagine your server can handle ten simultaneous requests, and your page takes one second to load.

If you get up to ten requests per second, everything works seamlessly, because the server can cope. But go just slightly over that and the eleventh request will either fail or have to wait until one of the other ten has finished. It then runs, but eats into the next second's ten requests. By the time ten seconds have gone by, you're a whole second behind on response time, and it keeps getting worse as long as requests keep pouring in at the same rate. It doesn't take long for the server to get overwhelmed, even when the load is only a fraction over its capacity.

Now imagine the same page could be optimised to take less time, let's say half a second. The same server can now cope with 20 requests per second, simply because the PHP code is quicker. It will also be easier for it to recover from excess traffic. And because the PHP code takes less time to run, there is less chance of any two given requests being simultaneous anyway.

In short, the server's capacity to cope with high traffic volumes increases enormously as you reduce the time taken to process a request.
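
As a rough rule of thumb (ignoring queueing effects):

    sustainable rate ≈ simultaneous requests the server can handle / time per request

    10 workers / 1.0 s per request = 10 requests per second
    10 workers / 0.5 s per request = 20 requests per second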

So this is the key to a site surviving a surge of high traffic: Make it run faster.

  • Caching: CMSs like Drupal and WordPress have caching built in; make sure it's enabled. For even better performance, consider a server-level cache like Varnish (see the sketch after this list). For a CMS-type site where the page content doesn't change often, this is the single biggest thing you can do to improve performance.

  • Optimise your code: while you can't be expected to fix performance issues in third-party software like Drupal, you can analyse the performance of your own code, if you have any (custom Drupal modules, maybe?). Use a profiler tool to find your bottlenecks. Very often this kind of analysis reveals that a single bottleneck is responsible for 90% of the page load time. Don't bother optimising the small stuff, but if you can find and fix one or two big bottlenecks like that, it can have a dramatic effect.
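
For illustration, a minimal Varnish setup might look something like the sketch below; the backend address/port and the 5-minute TTL are assumptions to adapt, and a real configuration also needs rules for cookies, logged-in users and POST requests:

    vcl 4.0;

    # Assumed layout: Varnish listens on port 80 and passes cache misses
    # to Apache/PHP listening on port 8080 of the same machine.
    backend default {
        .host = "127.0.0.1";
        .port = "8080";
    }

    sub vcl_backend_response {
        # Cache successful responses for 5 minutes (illustrative value).
        if (beresp.status == 200) {
            set beresp.ttl = 5m;
        }
    }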

Hope that helps.

Answer 2:

These types of attacks are called DoS (Denial of Service) attacks, or DDoS (Distributed Denial of Service) attacks when they come from many machines at once, and they are usually prevented by the web server hosting your PHP application. Since Apache is the most widely used, here is an article you might find interesting: http://www.linuxforu.com/2011/04/securing-apache-part-8-dos-ddos-attacks/.

The article states that Apache has multiple mods available specifically created to prevent (D)DoS attacks. These still need to be installed and configured to match your needs.
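
For example, mod_evasive (one of the modules usually mentioned for this) is configured with thresholds along these lines; the numbers are purely illustrative and the directives may differ between versions:

    <IfModule mod_evasive20.c>
        # Block an IP that requests the same page more than 5 times within
        # a 1-second interval, or more than 50 pages site-wide per second;
        # keep the block in place for 60 seconds (all values illustrative).
        DOSPageCount       5
        DOSPageInterval    1
        DOSSiteCount       50
        DOSSiteInterval    1
        DOSBlockingPeriod  60
    </IfModule>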

I do believe that Facebook, Google, etc. have their own similar implementations to prevent DoS attacks. I know for a fact that Google's search engine presents a CAPTCHA when a lot of search requests come from the same network.

The reason it is not wise to prevent DoS within a PHP script is that the PHP interpreter still has to be started for every request, which causes a lot of overhead. By letting the web server handle it, you incur less overhead.

EDIT: As stated in another answer, it is also possible to prevent common DoS attacks by configuring the server's firewall. Checking for attacks with firewall rules happens before the web server is even hit, so there is even less overhead, and you can detect attacks on other ports as well (such as port scans). I believe a combination of the two works best, as they complement each other.

Answer 3:

What you are facing is a DoS (Denial of Service) attack, where one system keeps sending requests to your web server until it becomes unresponsive.

You have mentioned a single jerk, but what if that jerk had many friends? Then you have a DDoS (Distributed DoS) attack, and that is very hard to prevent.


A quick fix from the Apache docs, for DoS but not for DDoS:

All network servers can be subject to denial of service attacks that attempt to prevent responses to clients by tying up the resources of the server. It is not possible to prevent such attacks entirely, but you can do certain things to mitigate the problems that they create.

Often the most effective anti-DoS tool will be a firewall or other operating-system configurations. For example, most firewalls can be configured to restrict the number of simultaneous connections from any individual IP address or network, thus preventing a range of simple attacks. Of course this is no help against Distributed Denial of Service attacks (DDoS).

Source
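
As a concrete illustration of the firewall approach described above, a single iptables rule can cap the number of simultaneous connections per source IP (the limit of 20 is an arbitrary example, adjust it to your traffic):

    # Drop new HTTP connections from any IP that already has more than
    # 20 connections open to port 80 (illustrative threshold).
    iptables -A INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 20 -j DROP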

Answer 4:

In my opinion, the best way to prevent DoS is to set up the firewall at the lowest level: at the entry point of the server. By setting up some network firewall rules with iptables, you can drop packets from senders that are hitting your server too hard.

It's more efficient than going through PHP and Apache, since they need to spend a relatively large amount of processing to do the checking, and they may still bring your website down even if you do detect your attacker(s).
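
For example, a pair of rules using the iptables "recent" module drops senders that open new connections too quickly (hypothetical thresholds; hit counts above 20 require raising the xt_recent module's ip_pkt_list_tot parameter):

    # Record every new connection to port 80 per source IP...
    iptables -A INPUT -p tcp --dport 80 -m state --state NEW -m recent --set --name HTTP
    # ...and drop a source once it has opened 20 new connections within
    # the last 60 seconds (illustrative values).
    iptables -A INPUT -p tcp --dport 80 -m state --state NEW -m recent \
        --update --seconds 60 --hitcount 20 --name HTTP -j DROP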

You can check this topic for more information: https://serverfault.com/questions/410604/iptables-rules-to-counter-the-most-common-dos-attacks