@KrisLowet
Last active September 23, 2025 17:38
#!/bin/bash
#
# Check if required commands are installed.
#
# A list of all commands required by this script.
required_commands=( "parallel" "dig" "wget" "sort" "ping" )
# Loop through the list and check each command.
for cmd in "${required_commands[@]}"; do
    if ! command -v "$cmd" &> /dev/null; then
        echo "Error: Required command '$cmd' is not found or not executable." >&2
        echo "Please ensure it is installed and in the system's PATH." >&2
        exit 1
    fi
done
#
# Define files and download the lists.
#
# File names of the hosts list and the result.
hosts_list='dnsblock_hosts_'$(date '+%Y%m%d')'.txt' # One FQDN per line in file.
result='dnsblock_result_'$(date '+%Y%m%d')'.csv' # CSV result file.
# Download the most recent hosts lists and combine to one file.
echo "### Downloading hosts lists."
wget --quiet -O "dnsblock_test_list_1.txt" "https://malware-filter.gitlab.io/malware-filter/urlhaus-filter-hosts.txt" # https://gitlab.com/malware-filter/urlhaus-filter
wget --quiet -O "dnsblock_test_list_2.txt" "https://hole.cert.pl/domains/v2/domains.txt" # https://cert.pl/en/warning-list/
cat dnsblock_test_list_1.txt dnsblock_test_list_2.txt > dnsblock_test_list_concat.txt
# Clean up the list.
sed -i '/^[[:blank:]]*#/d; s/#.*//; s/^0\.0\.0\.0 //; s/\t//g; /^ *$/d' dnsblock_test_list_concat.txt # Remove comments, '0.0.0.0' prefixes, tabs, and blank lines.
sort -u dnsblock_test_list_concat.txt > "$hosts_list" # Create a list with only unique hosts.
totalhosts=$(wc -l < "$hosts_list")
echo "### Hosts to test: $totalhosts"
#
# Define IP address of the nameservers used for lookups.
# Keep the correct order for the resolver name and IP address.
#
# Resolver names.
ns_sp_array=(
    'Cloudflare unfiltered'
    'ControlD Malware'
    'DNS4EU Protective resolution'
    'UltraDNS Threat Protection'
    'Quad9'
    'Cloudflare for Families'
    'dns0.eu'
    'CleanBrowsing Security Filter'
)
# Resolver IP addresses.
ns_ip_array=(
    '1.1.1.1'
    '76.76.2.1'
    '86.54.11.1'
    '156.154.70.2'
    '9.9.9.9'
    '1.1.1.2'
    '193.110.81.0'
    '185.228.169.9'
)
# Generate the header for the CSV file.
header="Domain name"
for i in "${!ns_sp_array[@]}"; do
    header="$header,${ns_sp_array[$i]} - ${ns_ip_array[$i]}"
done
echo "$header" > "$result"
#
# Take the average ping to the nameservers.
#
echo "### Checking average ping to nameservers."
ping_results="PING (ms)"
for ns_ip in "${ns_ip_array[@]}"; do
    avg_ping=$(ping -c 5 -W 1 -q "$ns_ip" | grep '^rtt' | cut -d ' ' -f 4 | cut -d '/' -f 2)
    ping_results="$ping_results,${avg_ping:--}"
done
echo "$ping_results" >> "$result"
#
# Define a function that does the lookup and filter blackholes.
#
dig_and_filter() {
    local ns_ip=$1
    local domain_to_test=$2
    local ip
    # Execute dig and keep only the last IPv4-looking answer.
    ip=$(dig @"$ns_ip" +noadflag +noedns +short "$domain_to_test" | grep '^[.0-9]*$' | tail -n1)
    # Filter out unwanted IP addresses and return an empty result if necessary.
    # These are the blackhole IP addresses used by the DNS resolvers:
    # 127.0.0.1 and 0.0.0.0 = general
    # 51.15.69.11 = DNS4EU
    # 156.154.112.16 and 156.154.113.16 = UltraDNS
    if [[ "$ip" == "127.0.0.1" || "$ip" == "0.0.0.0" || "$ip" == "51.15.69.11" || "$ip" == "156.154.112.16" || "$ip" == "156.154.113.16" ]]; then
        echo ""
    else
        echo "$ip"
    fi
}
export -f dig_and_filter # Export the function so that GNU Parallel can use it.
#
# Test a list of safe hosts to ensure that the nameservers are responding well.
#
echo "### Testing safe hosts to ensure nameservers are responding."
safe_hosts=( nexxwave.be nasa.gov google.com cloudflare.com microsoft.com )
for domain in "${safe_hosts[@]}"; do
    echo "Testing $domain (safe domain)..."
    results_line="$domain (safe domain)"
    for ns_ip in "${ns_ip_array[@]}"; do
        ip=$(dig_and_filter "$ns_ip" "$domain")
        results_line="$results_line,$ip"
    done
    echo "$results_line" >> "$result"
done
#
# Start the parallel test.
#
echo -e "\n### Start parallel test of hosts at $(date)"
echo -e "\n" >> "$result"
echo "$header" >> "$result"
while IFS= read -r domain; do
    # Skip any blank lines in the list.
    if [ -z "$domain" ]; then continue; fi
    # Provide feedback on the console that we are testing this domain.
    # Use -n to not print a newline, so that the 'Skipping' message can appear on the same line.
    echo -n "Testing $domain ..."
    # First test with the unfiltered resolver (the first on the list).
    ip0=$(dig_and_filter "${ns_ip_array[0]}" "$domain")
    # Only proceed if the domain has a valid A record on the unfiltered resolver.
    if [[ -n "$ip0" ]]; then
        echo " OK"
        # Make the domain name 'safe' for use in the parallel subshell.
        safe_domain=$(printf %q "$domain")
        # Run the test for the testing resolvers at the same time.
        testing_ips=$(printf "%s\n" "${ns_ip_array[@]:1}" | grep . | parallel -j 8 --keep-order "dig_and_filter {} $safe_domain")
        # Convert the newline-separated output to comma-separated.
        testing_ips_csv=$(echo "$testing_ips" | paste -sd, -)
        # Write the complete line, including the result of the first resolver, to the result file.
        echo "$domain,$ip0,$testing_ips_csv" >> "$result"
    else
        # If ip0 is empty, give feedback and skip the rest.
        echo " Skipping"
    fi
done < "$hosts_list"
echo "### End test of hosts at $(date)"
echo "### Result file created at: $result"
@WFANG12719

This is a great script, and the results are good to know (I found it via Eric Sauvageau's retweet). I gave it a quick test on my router and found that each domain took about 25 seconds to complete, so running the whole list would take me roughly 38 days. If it could be optimized for a multi-threaded environment, the efficiency would improve significantly.

@KrisLowet (Author)

25 seconds for 10 DNS requests is a long time. Maybe you can test only the DNS resolvers you are interested in instead of all of them.

@WFANG12719

Thanks, Kris. Yes, I selected some and added another two. BTW, I made some changes to this script and achieved "multi-threading" by using a Linux file descriptor (FD). I tested it on my router (4× ARMv8, 2 GB RAM) with 12 DNS servers, and it could finish within 6-8 hours. I tried 100 sessions on my router and it worked well (I first tested 20 threads, then 50, 80, and 100; CPU usage was about 80% but RAM usage was 40%).
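The file-descriptor approach described above can be sketched roughly like this (a hypothetical minimal version, not the actual modified script: `lookup` stands in for the real `dig_and_filter` call, and `max_jobs` would be the session count being tuned). A FIFO opened read/write on a spare FD acts as a token bucket that caps how many background jobs run at once:

```shell
#!/bin/bash
max_jobs=4                        # cap on concurrent jobs (illustrative value)
fifo=$(mktemp -u)
mkfifo "$fifo"
exec 3<>"$fifo"                   # open the FIFO read/write on FD 3
rm "$fifo"                        # the FD stays usable after unlinking
# Seed the bucket with one token (newline) per allowed job.
for ((i = 0; i < max_jobs; i++)); do echo >&3; done

results=$(mktemp)

lookup() {                        # placeholder for the real DNS lookup
    sleep 0.1
    echo "done $1" >> "$results"
}

for n in $(seq 1 8); do
    read -r -u 3                  # take a token; blocks while max_jobs are running
    {
        lookup "$n"
        echo >&3                  # return the token when the job finishes
    } &
done
wait                              # wait for the remaining background jobs
exec 3>&-                         # close the token FD
```

Each background job inherits FD 3, so returning a token is just a write to it; `read -u 3` then unblocks the main loop. This avoids a GNU Parallel dependency at the cost of managing the job pool by hand.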

@SkewedZeppelin

Always nice to see different approaches; I made a similar script recently:
https://github.com/divestedcg/circumnavigator/blob/master/test.sh
https://divested.dev/misc/circumnavigator.txt

@WFANG12719

@SkewedZeppelin, great!

@KrisLowet (Author)

I've updated the script with a parallel test. It improves the speed by a factor of 4.
