RATE LIMITING: THE WHAT AND THE WHY
For those not familiar with rate limiting, it is a protection against a range of cyber attacks. Rate limiting works by capping the number of allowed attempts at an action within a given time period (a minimal code sketch of this idea follows the list below). It can reduce the risk of attacks such as:
- Username / Password Guessing
  - Trying many times to identify valid usernames via the sign-up or forgotten-password forms.
  - Trying many times to log in using a list of known usernames with either likely passwords or every possible password combination.
- Website Cookie Guessing
  - Each website cookie has a unique identifier that, if guessed, would give the attacker the same access as the username and password.
- Locking Accounts
  - Account locking can be considered an alternative to rate limiting in some cases, but it has the disadvantage of disabling accounts, which may have been the aim of the attack in the first place.
- Input Flood
  - For example, on a website that allows people to post comments, an attacker could post millions of comments to disrupt other users or to overload the server and cause an outage.
- Denial of Service
  - A page that places significant load on the server could, if loaded many times, overload the server and cause an outage.
- Data Breach Extraction
  - Some methods of extracting data require loading a page many times.
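To make the idea above concrete, here is a minimal sketch of a fixed-window rate limit check in PHP. It is illustrative only and is not the library discussed later in this article: the function name, the session-based store and the limits are all assumptions chosen to keep the sketch self-contained.

```php
<?php
// Minimal fixed-window rate limiting sketch (illustrative only).
// Allows at most $maxAttempts per $windowSeconds for a given key,
// using the PHP session as a simple per-visitor store.
function allow_attempt(string $key, int $maxAttempts, int $windowSeconds): bool
{
    if (session_status() !== PHP_SESSION_ACTIVE) {
        session_start();
    }

    $now    = time();
    $bucket = $_SESSION['rate'][$key] ?? ['start' => $now, 'count' => 0];

    // Start a fresh window once the previous one has expired.
    if ($now - $bucket['start'] >= $windowSeconds) {
        $bucket = ['start' => $now, 'count' => 0];
    }

    $bucket['count']++;
    $_SESSION['rate'][$key] = $bucket;

    return $bucket['count'] <= $maxAttempts;
}

// Example: allow at most 10 login attempts per hour for this visitor.
if (!allow_attempt('login', 10, 3600)) {
    http_response_code(429);
    exit('Too many attempts, please try again later.');
}
```

In practice the counter would be keyed on something the attacker cannot simply discard (such as an IP address or username) and stored server-side independently of the session; the session store here only keeps the sketch short.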
RATE LIMITING ARCHITECTURE
Once you have decided to implement rate limiting, the question is how and where. Considering the different types of cyber attacks that rate limiting mitigates, it makes sense to apply it in most situations and for most connections, including Active Directory logins, websites, email, etc.
Let us focus on rate limiting a website and drill down into a little more detail.
Although applying rate limiting to an entire website might be a quick fix, it may not take into account that different pages need different rates. As an example, a general limit of 100 page loads per minute per person may work for most pages, but it is not restrictive enough for a login page, where you may want to allow only 10 login attempts per person per hour.
Therefore, an approach that provides the flexibility to rate limit sensitive pages in specific ways within the code, while still rate limiting all pages, is to place an HTTP reverse proxy in front of the website. This would be implemented as follows:
- A transparent HTTP reverse proxy (e.g. NGINX or Varnish) manages rate limiting across the entire website (a minimal configuration sketch follows this list).
- A firewall can also provide general per-IP rate limiting, for example with iptables.
- Code-specific rate limiting is applied to sensitive pages, e.g. login, password reset, etc.
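To illustrate the proxy layer in the first point above, below is a minimal NGINX configuration sketch. The zone name, hostname and backend address are assumptions, and because NGINX expresses request rates per second or per minute, the general 100-loads-per-minute limit is handled here while the stricter 10-per-hour login limit is left to the application code, as described above.

```nginx
# Illustrative only: site-wide rate limiting at the reverse proxy.
# The limit_req_zone directive goes inside the http {} block; it tracks
# clients by IP address and allows roughly 100 requests per minute each.
limit_req_zone $binary_remote_addr zone=perip:10m rate=100r/m;

server {
    listen 80;
    server_name example.com;               # assumed hostname

    location / {
        limit_req zone=perip burst=20;     # small burst allowance
        proxy_pass http://127.0.0.1:8080;  # assumed backend application
    }
}
```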
This approach provides the following advantages:
- It is easy to set up and manage, as simple rules go into the NGINX server and complex rules are handled in code.
- It provides two layers of rate limiting defence, so anything that bypasses the HTTP reverse proxy is still rate limited.
- Rate limiting for sensitive pages happens within the code itself, so it stops the code from executing any further.
- Rate limiting events can be logged at an appropriate level, either for manual review or to automatically apply further restrictions.
  - This may be important for detecting certain types of data breach attacks.
- The performance impact was less than 1% in our test setup.
RATE LIMITING IMPLEMENTATION FOR CODE
We found a range of existing rate limiting code that can be leveraged to perform rate limiting within your own code; however, we decided to build our own for the following reasons:
- It needs to be written in PHP.
- It must have little to no performance impact for the average user.
- It needs to be simple (complexity is an enemy of security).
- The rate limiting code must have no extra dependencies that could create further security risks.
  - This means it does not require any special database or in-memory service (such as memcached) to work, which avoids vulnerabilities such as the recent memcached ones (CVE-2016-8706 and CVE-2017-9951).
The code is available on GitHub.
For the technically minded, we use a couple of tricks, including (a simplified sketch follows this list):
- Using inode metadata (file existence, file date, file size), which is typically served from memory, for most cases to maximise performance.
- Minimising slow disk reads by using only fgets() to determine the outcome quickly.
- Hashing the input parameters for security, which both sanitises them and limits their length.
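The snippet below is a simplified sketch of those tricks, not the actual code on GitHub. The identifier (for example a username or IP address) is hashed to make a safe, fixed-length filename; the file's existence and modification time (inode metadata) decide most cases; and the counter itself is read with a single fgets() call. The function name, storage path and limits are illustrative.

```php
<?php
// Simplified sketch of file-based rate limiting using inode metadata.
// Not the actual library; names, paths and limits are illustrative.
function rate_limit_check(string $identifier, int $maxAttempts, int $windowSeconds): bool
{
    // Hashing the input sanitises it and caps its length, making it safe as a filename.
    $file = sys_get_temp_dir() . '/ratelimit_' . hash('sha256', $identifier);

    // Inode metadata (existence, modification time) is typically served from
    // memory, so the common "no recent attempts" case is very cheap.
    if (!file_exists($file) || time() - filemtime($file) >= $windowSeconds) {
        file_put_contents($file, "1\n");   // start a new window with one attempt
        return true;
    }

    $windowStart = filemtime($file);

    // Within the window: read the counter with a single fgets() call.
    $handle = fopen($file, 'r+');
    $count  = (int) fgets($handle);

    if ($count >= $maxAttempts) {
        fclose($handle);
        return false;                      // limit reached, block this attempt
    }

    // Record this attempt and keep the original window start time.
    rewind($handle);
    fwrite($handle, ($count + 1) . "\n");
    fclose($handle);
    touch($file, $windowStart);
    return true;
}

// Example: at most 10 attempts per hour for a given username (illustrative).
$user = $_POST['username'] ?? 'anonymous';
if (!rate_limit_check('login:' . $user, 10, 3600)) {
    http_response_code(429);
    exit('Rate limit exceeded.');
}
```

A production version would also need locking around the read-modify-write step to handle concurrent requests safely; that is omitted here for brevity.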
So go forth and rate limit your resources.
If you need help securing or rate limiting your server, website, email or Active Directory, get in contact with us.