
@hsiboy
hsiboy / prepareforsiege.sh
Created May 6, 2014 09:51
Prepares IIS W3C log files for use with siege
#!/bin/bash
#
# Takes IIS W3C formatted log files, and creates a load file for use with siege.
#
#
# make sure you always put $f in double quotes to avoid any nasty surprises i.e. "$f"
for f in "$@"
logfile = siege-result.csv
verbose = true
timeout = 122
show-logfile = false
logging = true
protocol = HTTP/1.1
chunked = true
cache = false
connection = close
concurrent = 15
@hsiboy
hsiboy / PrepareForSiege.sh
Created May 6, 2014 10:46
prepare IIS W3C log files for use by siege - concatenate cs-uri-stem and the cs-uri-query
#!/bin/bash
# make sure you always put $f in double quotes to avoid any nasty surprises i.e. "$f"
for f in "$@"
do
if [ ! -f "$f" ];
then
echo "File ($f) not found!" >&2
exit 1
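The two preview snippets above stop before the interesting part: concatenating cs-uri-stem and cs-uri-query into siege-ready URLs. A minimal sketch of that step, not the gist's actual script: it reads the `#Fields:` header to locate the two columns, so the field order in the log doesn't matter. The `BASE` prefix is an assumption, since the W3C log holds paths, not full URLs.

```shell
#!/bin/sh
# Sketch: turn an IIS W3C log into a siege load file, one URL per request.
# Usage: w3c_to_urls logfile [base-url]
w3c_to_urls() {
  BASE="${2:-http://www.example.com}"   # assumed host prefix
  awk -v base="$BASE" '
    /^#Fields:/ {
      # Map field names to data columns ($1 is the "#Fields:" token itself,
      # so header token i describes data column i - 1).
      for (i = 2; i <= NF; i++) {
        if ($i == "cs-uri-stem")  stem = i - 1
        if ($i == "cs-uri-query") query = i - 1
      }
      next
    }
    /^#/ { next }                       # skip other W3C comment lines
    NF > 1 && stem {
      url = base $stem
      if (query && $query != "-")       # IIS logs "-" for an empty query
        url = url "?" $query
      print url
    }
  ' "$1"
}
```

Something like `w3c_to_urls u_ex140506.log > urls.txt` would then feed `siege -f urls.txt` along with the .siegerc settings shown above.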
@hsiboy
hsiboy / akamai.log
Created May 7, 2014 13:21
read a list of hosts and see if they use akamai
www.about.com akadns.net
www.adobe.com
www.amazon.com
www.amazon.co.uk
www.aol.com akadns.net
www.aol.co.uk akadns.net
www.apple.com akadns.net
www.argos.co.uk
www.asda.co.uk
www.ask.com edgesuite.net
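The akamai.log output above pairs each host with the Akamai-owned domain found in its DNS chain, when there is one. A minimal sketch of how such a check could work: `uses_akamai()` is pure string matching (the domain list is an assumption), and the `dig` lookup in `check_host` is the assumed resolution step.

```shell
#!/bin/sh
# Return success if a CNAME chain mentions a known Akamai domain.
uses_akamai() {
  case "$1" in
    *akadns.net*|*edgesuite.net*|*akamaiedge.net*|*akamai.net*) return 0 ;;
    *) return 1 ;;
  esac
}

# Resolve one host and print it, plus the chain if it looks like Akamai.
check_host() {
  chain=$(dig +short CNAME "$1" 2>/dev/null)
  if uses_akamai "$chain"; then
    echo "$1 $chain"
  else
    echo "$1"
  fi
}

# Example usage (hosts.txt is an assumed input file):
# while read -r host; do check_host "$host"; done < hosts.txt > akamai.log
```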
@hsiboy
hsiboy / check404s.sh
Created May 7, 2014 13:31
check a bunch of pages
#!/bin/bash
NOWD=$(date +"%Y-%m-%d")
URLS="
http://www.domain.com/page1.html
http://www.domain.com/page2.html
http://www.domain.com/pagen.html
http://www.domain.com/and/so/on.html
"
@hsiboy
hsiboy / RefererCheck.pl
Last active August 29, 2015 14:01
Zeus Traffic Manager - Bandwidth Management and Rate Shaping (based on referer)
# Automatically limit all requests from any referer.
# Uses two rate classes, BusyReferer for those sites that send a large amount of traffic
# and StandardReferers for those that don't.
# Referer whitelist. These referers are never rate limited.
$whitelist = "localhost 172.16.121.100";
# Referers that are allowed to pass a higher number of clients.
$highTraffic = "google mypartner.com";
@hsiboy
hsiboy / SearchLimit.pl
Last active August 29, 2015 14:01
Zeus Traffic Manager - Dynamic rate shaping for slow applications
# implement the following policy:
# if transactions complete within 50 ms, do not attempt to shape traffic.
# if transactions take more than 50 ms, assume that we are in danger of overload:
# rate-limit traffic to 100 requests per second, and if requests exceed that
# limit, send back a '204 No Content' response rather than queuing them.
# Once transaction time comes down to less than 50ms, remove the rate limit.
# Create a rate shaping class named "Searchlimit" with a max_rate_per_second: 100
# Create a Service Level Monitoring class named SearchTimer with a response_time of 500 ms.
@hsiboy
hsiboy / BotBuster.md
Last active July 26, 2023 20:57
Bot-Buster™ - Tracks nefarious activity on a website, and manages it accordingly.

Bot-Buster™

Tracks nefarious activity on a website, and manages it accordingly.

It's probably a bot.

If the requesting entity:

  • declares its user-agent as being wget, curl, webcopier etc. - it's probably a bot.
  • requests details -> details -> details -> details ad nauseam - it's probably a bot.
  • requests the HTML, but not the .css, .js or other site furniture - it's probably a bot.
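The first heuristic in the list is easy to automate. A minimal sketch, not Bot-Buster itself: scan an access log for self-declared fetch tools and print the offending client IPs. The user-agent list and the log format (combined log, agent in the sixth quote-delimited field) are assumptions.

```shell
#!/bin/sh
# Print the unique client IPs whose user-agent declares a fetch tool.
flag_bot_uas() {
  awk -F'"' '
    # In combined log format the user-agent is the 6th quote-delimited field.
    tolower($6) ~ /wget|curl|webcopier|libwww|python-requests/ {
      split($1, parts, " ")
      print parts[1]          # the client IP - probably a bot
    }
  ' "$1" | sort -u
}

# Example usage (log path assumed):
# flag_bot_uas /var/log/apache2/access.log
```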
@hsiboy
hsiboy / fetch.sh
Created June 3, 2014 09:53
fetch a url a number of times, spit out csv
#!/bin/bash
echo -n "url to measure: "
read url
URL="$url"
echo -n "How many times to fetch? "
read loop
NOWD=$(date +"%Y-%m-%d")
clean_var()
{
INVAR=$1
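The fetch.sh preview ends mid-function, but the description says where it is heading: fetch a URL N times and spit out CSV. A sketch of that loop under those assumptions; `csv_row()` is the pure formatting step and `curl -w` supplies the timing.

```shell
#!/bin/sh
# date,url,http_code,total_time
csv_row() {
  echo "$1,$2,$3,$4"
}

# One fetch, reporting "code,seconds" via curl's write-out variables.
fetch_once() {
  curl -s -o /dev/null -w '%{http_code},%{time_total}' "$1"
}

# Fetch a URL N times, printing one CSV row per fetch.
fetch_n() {
  url="$1"; loop="$2"; NOWD=$(date +"%Y-%m-%d")
  i=0
  while [ "$i" -lt "$loop" ]; do
    set -- $(fetch_once "$url" | tr ',' ' ')
    csv_row "$NOWD" "$url" "$1" "$2"
    i=$((i + 1))
  done
}

# Example usage (file name assumed):
# fetch_n "http://www.example.com/" 5 > "fetch-$(date +%Y-%m-%d).csv"
```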