@jesselawson
Last active February 12, 2022 15:07
Automatically Block Banned IPs with fail2ban, iptables, and ipset

In this tutorial, we'll develop a script that grabs all the IP addresses blocked by fail2ban on the ssh chain and then adds them to an ipset that iptables blocks automatically. Talk about powerful traffic management!

One of the most frustrating parts about running a web hosting company is the exposure to spam and bad bot traffic. At my old hosting company, I often had to scrub through IP logs to determine which traffic was legitimate and which should have been blocked outright. In fact, more web hosting companies could do this, but they choose not to, because more traffic = more money.

If you've read any of my other tutorials, you know I have a folder full of tools that I developed to help manage my servers. One of them is an automatic bad bot and spam traffic filter script, which I call scrub_fail2ban.sh. This script will perform three primary functions:

  1. Build a list of IPs to scrub based on the IPs blocked by fail2ban -- specifically, this script will create a list of the IPs that fail2ban has added to the fail2ban-ssh chain. These are IPs that failed to log in via SSH; since I don't allow SSH access to my clients, I know exactly which IPs should be accessing my boxes.

  2. Add the offending IPs to an ipset, which is a very fast, very efficient way of blocking tens of thousands of IP addresses at a time.

  3. Restart the fail2ban service so that the network is not bogged down by too many raw DROP entries in iptables.

Here's the code:

#!/bin/bash

# This script will spit out all of the IPs that have been blocked by fail2ban-ssh, 
# then for each one, add it to our `ipset blacklist`. It will then restart fail2ban 
# to flush the fail2ban-ssh drop chain. 

# Build the ipset if it's not already built (-exist makes this a no-op if it is)
ipset create blacklist hash:ip -exist

# Build a list of IPs to scrub (skip the chain's RETURN line, which carries no real source IP)
iptables -L fail2ban-ssh -v -n | grep -E '[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}' | awk '$3 != "RETURN" {print $8}' > blockthese.txt

# Read that list into an array
declare -a scrublist
readarray -t scrublist < blockthese.txt

# loop through each IP address in the array
for i in "${scrublist[@]}"
do
        # Add that IP to the blacklist
        echo -e "Adding $i to blacklist...\n"
        ipset add blacklist "$i" -exist
done

echo -e "All finished."

# Delete the temporary file we just made
rm blockthese.txt

# Just in case, make sure the iptables rule dropping web traffic from the blacklist set is in place
# (iptables -C checks whether the rule already exists before we insert it)
iptables -C INPUT -m set --match-set blacklist src -p tcp --destination-port 80 -j DROP 2>/dev/null || \
    iptables -I INPUT -m set --match-set blacklist src -p tcp --destination-port 80 -j DROP

# Now we'll restart fail2ban
sudo service fail2ban restart

# And that's it!

You'll notice that this code tries to build the ipset every time it runs, and it re-issues the iptables DROP entry for the blacklist ipset as well. Both are failsafes: the -exist flag makes the ipset commands no-ops when the set (or an entry) already exists, and the iptables -C check keeps the DROP rule from being inserted more than once, so the script is safe to run over and over.
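
If you ever want to confirm the failsafes are doing their job, a couple of quick checks (not part of the script, just run them by hand) will show the current contents of the set and the matching rule:

# Show the IPs currently in the blacklist set
ipset list blacklist

# Confirm the DROP rule that references the set is in the INPUT chain
iptables -L INPUT -v -n | grep "match-set blacklist"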

To make this run nightly, open your crontab with crontab -e and add an entry like this:

30 2 * * * /var/your_tools/scrub_fail2ban.sh
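
Two things worth double-checking before you rely on cron (assuming the script lives at the path used above): the script needs to be executable, and because ipset and iptables require root privileges, the entry belongs in root's crontab.

# Make the script executable
chmod +x /var/your_tools/scrub_fail2ban.sh

# Edit root's crontab, since ipset and iptables need root
sudo crontab -e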

Good luck with your traffic management!

-Jesse

Periodically, log files should be scrubbed for bad bots and malicious IPs. Let's do that.

This will work with any log file in which you have one IP address per line.

The first thing we want to do is pull out all the IP addresses in our log file. Let's go ahead and do that for one of my SaaS's log files right now:

grep -E -o '(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)' myapp.access.log >> ipslist.txt

Make sure you stick all that on a single line!

Unfortunately, if you cat ipslist.txt you'll see a bunch of duplicate entries. To get rid of them, let's sort the file and keep only the unique lines:

sort -u ipslist.txt > ipslist

Now remove the old file with rm ipslist.txt because we don't need it anymore.

Go ahead and cat ipslist to see the list of unique IPs.
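
If you'd rather skip the temporary file entirely, the extraction and the deduplication can be rolled into a single pipeline -- same regex as before, just piped straight into sort:

grep -E -o '(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)' myapp.access.log | sort -u > ipslist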

With this list, you can use the methods described at the end of my preventing_spam_and_bad_bot_traffic tutorial, in which we create a custom PHP script that scrubs through the IPs and automatically prints out deny entries formatted for Nginx.
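
If you don't want to reach for PHP, a quick shell sketch does the same job. The output filename below is just an example; you'd include it from your own Nginx config (for instance with an include directive inside your server block):

# Turn each unique IP in ipslist into an Nginx "deny" directive.
# blockips.conf is only an example name -- include it wherever your config expects it.
awk '{print "deny " $1 ";"}' ipslist > blockips.conf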

Alternatively, you could go straight to iptables (which is what I do) and just block them at the server level:

BLOCK_THIS_IP="x.x.x.x"
iptables -A INPUT -s "$BLOCK_THIS_IP" -j DROP
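
And if you want to feed the whole list into the same blacklist ipset used by the scrub script above, a sketch like this will do it (it assumes that set and its DROP rule are already in place):

# Add every IP from ipslist to the existing "blacklist" ipset.
# -exist keeps ipset from complaining about entries that are already there.
while read -r ip; do
    ipset add blacklist "$ip" -exist
done < ipslist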