Recently one of my client's websites was under a DDoS attack, which resulted in less-than-desirable site performance. In this blog post I will elaborate on the precautions we took to make sure the site keeps performing as expected (and doesn't scale out to 10 instances, costing us lots of $$).
Determining you have an issue
We started getting emails from Raygun telling us that requests to our website were timing out. After logging into the Azure Portal we were greeted by this:
(Image: Downtime)
Further exploration of Application Insights gave us the following view of our traffic during the night, along with the exceptions being thrown:
(Image: Exceptions everywhere)
It was clear that we had somehow received a huge load of requests during the night, which resulted in an unresponsive website.
First we evaluated our instance scaling configuration, because we had expected the site to scale out instead of crashing. In the previous Azure management portal you could only configure scaling by memory usage. The new portal allows other metrics to act as indicators for scaling up and down as well:
- HTTP queue length
- CPU usage
- Inbound traffic
- Outbound traffic
So we proceeded to configure scaling by HTTP queue length instead of memory, because our application almost never reaches its memory limits.
Finding the cause
We used the logs supplied by Application Insights to determine if we could find a common denominator in the large amount of requests coming to us.
After some digging we found out that we received a lot of traffic with an identical (outdated) User-Agent:
`Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)`
A simple [Google search](https://www.google.nl/search?q="Mozilla/4.0+(compatible%3B+MSIE+6.0%3B+Windows+NT+5.1%3B+SV1"&tbs=qdr:m&cad=h) for this query string confirmed our suspicion; we were being DDoS’d by the PushDo botnet!
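Finding such a common denominator mostly comes down to tallying header values across the raw request logs. As a rough sketch (the sample lines and field layout below are illustrative, not our actual logs — IIS W3C logs encode spaces in the User-Agent as `+`), this is the kind of counting that surfaces an outlier:

```python
from collections import Counter

def top_user_agents(log_lines, n=3):
    """Count User-Agent values in W3C-style log lines.

    Assumes the User-Agent is the last space-delimited field,
    with '+' substituted for spaces as IIS logs it.
    """
    counts = Counter()
    for line in log_lines:
        if line.startswith("#"):  # skip W3C header directives
            continue
        ua = line.split(" ")[-1].replace("+", " ")
        counts[ua] += 1
    return counts.most_common(n)

# Illustrative sample lines (not real log data)
sample = [
    "#Fields: date time cs-method cs-uri-stem cs(User-Agent)",
    "2015-06-01 03:12:01 GET / Mozilla/4.0+(compatible;+MSIE+6.0;+Windows+NT+5.1;+SV1)",
    "2015-06-01 03:12:02 GET / Mozilla/4.0+(compatible;+MSIE+6.0;+Windows+NT+5.1;+SV1)",
    "2015-06-01 03:12:03 GET / Mozilla/5.0+(Windows+NT+10.0)",
]
print(top_user_agents(sample))
```

The same grouping is what the Application Insights queries gave us out of the box; the script form is just handy when all you have is the raw log files.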
Shutting out requests by User Agent
Because the User-Agent string used in the DDoS is really old (Windows XP SP2 + IE 6.0), we decided to block it from accessing our website. We did this by adding a request filtering rule to the requestFiltering section of our IIS configuration.
```xml
<system.webServer>
  <security>
    <requestFiltering>
      <filteringRules>
        <filteringRule name="Block Pushdo DDOS User Agent" scanUrl="false" scanQueryString="false">
          <scanHeaders>
            <clear />
            <add requestHeader="User-Agent" />
          </scanHeaders>
          <appliesTo>
            <clear />
          </appliesTo>
          <denyStrings>
            <clear />
            <add string="Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)" />
          </denyStrings>
        </filteringRule>
      </filteringRules>
    </requestFiltering>
  </security>
</system.webServer>
```
This effectively blocks visitors with the given user agent from visiting our website.
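Conceptually, the rule behaves like the sketch below: a request is rejected whenever its User-Agent header contains one of the deny strings (IIS itself answers such requests with a 404 rather than serving them). The function name and structure here are illustrative, not part of IIS:

```python
DENY_STRINGS = [
    "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)",
]

def is_blocked(headers):
    """Mimic the scanHeaders/denyStrings check: reject the
    request if its User-Agent contains any deny string."""
    ua = headers.get("User-Agent", "")
    return any(s in ua for s in DENY_STRINGS)

# The botnet's requests are rejected...
print(is_blocked({"User-Agent": "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)"}))  # True
# ...while a modern browser passes through.
print(is_blocked({"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}))  # False
```

Note this is substring matching, so the deny string would also catch User-Agents that merely embed the old token; for our purposes that was acceptable.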
Throttling requests per unique visitor IP
As an additional bonus we also configured throttling of concurrent requests per unique IP, to protect against other types of DDoS attacks.
```xml
<system.webServer>
  <security>
    <dynamicIpSecurity>
      <!-- max 20 requests within 500 milliseconds allowed -->
      <denyByRequestRate enabled="true" maxRequests="20" requestIntervalInMilliseconds="500" />
      <!-- max 10 requests in flight at a time -->
      <denyByConcurrentRequests enabled="true" maxConcurrentRequests="10" />
    </dynamicIpSecurity>
  </security>
</system.webServer>
```
In the above example, setting the enabled attribute to true on the denyByRequestRate element tells IIS to block requests from an IP address when the number of requests observed within the time window defined by requestIntervalInMilliseconds (500 ms in the example) exceeds the value of the maxRequests attribute (20 in the example). So a client making more than 20 requests within half a second will be blocked.
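In effect, IIS keeps a short window of recent request timestamps per client IP and denies requests once the window fills up. A minimal sketch of that denyByRequestRate logic (the class and method names are mine, not IIS's):

```python
import time
from collections import defaultdict, deque

class RequestRateLimiter:
    """Deny a client IP once it exceeds max_requests within
    interval_ms, roughly like denyByRequestRate."""

    def __init__(self, max_requests=20, interval_ms=500):
        self.max_requests = max_requests
        self.interval = interval_ms / 1000.0
        self.history = defaultdict(deque)  # ip -> recent request times

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        window = self.history[ip]
        # Drop timestamps that have fallen out of the interval.
        while window and now - window[0] > self.interval:
            window.popleft()
        if len(window) >= self.max_requests:
            return False  # IIS would deny the request here
        window.append(now)
        return True

limiter = RequestRateLimiter(max_requests=3, interval_ms=500)
print([limiter.allow("10.0.0.1", now=i * 0.1) for i in range(5)])
# → [True, True, True, False, False]
```

Once the old timestamps age out of the window, the same IP is served again, which is why this throttles bursts without permanently banning anyone.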