@cihanmehmet
Last active March 26, 2024 03:12
Subdomain Wordlist

⏳🔺33 Million Subdomain Wordlist🔻🧱🔨👀

cmd@fb:/tmp|❯ wc -l 33m-subdomain-wordlist.txt
 33927885 33m-subdomain-wordlist.txt

🚨🔺15 Million Subdomain Wordlist🔻 🧱🔨👀

cmd@fb:/tmp|❯ wc -l 15m_sub_wordlist.txt
 15677820 15m_sub_wordlist.txt

11m_sub_wordlist.txt Download Link

cmd@fb:~/Desktop|⇒  wc -l 11m_sub_wordlist.txt
11466866 11m_sub_wordlist.txt

11m_sub_wordlist.txt 137 MB


Subdomain Brute Force

cat 11m_sub_wordlist.txt | subgen -d "bing.com" > not_resolve_subdomain.txt
echo "bing.com" > bing.txt
comb -s="." 11m_sub_wordlist.txt bing.txt > not_resolve_subdomain.txt
cat not_resolve_subdomain.txt | zdns A --threads 10000 --name-servers=1.1.1.1 | jq -r "select(.data.answers[0].name) | .name" | tee resolve_subdomain.txt

ZDNS Too Many open files Error Solution

ulimit -n 100000
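Note that `ulimit` changes are per-shell: the raised limit applies only to the shell where you run it (and its children), so run it in the same session as zdns. A small sketch of that scoping:

```shell
# Show the current soft limit on open file descriptors
echo "current soft limit: $(ulimit -n)"

# A limit set inside a subshell applies only there; the parent is unaffected.
# (The soft limit can be raised up to the hard limit shown by `ulimit -Hn`;
# raising the hard limit itself requires root.)
( ulimit -n 1024; ulimit -n ) > /tmp/subshell_limit.txt

cat /tmp/subshell_limit.txt
```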

Aiodnsbrute

sudo pip3 install aiodnsbrute
aiodnsbrute yahoo.com -w /tmp/11m_sub_wordlist.txt -o csv -t 10000 -r resolver.txt

resolver.txt contents: 1.1.1.1

cat yahoo.com.csv | cut -d, -f1 | grep yahoo.com > yahoo.com.txt
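The cut/grep step can be checked against a hand-made sample. The CSV layout below (hostname in the first field) is an assumption for illustration, and the grep pattern is anchored so unrelated lines do not slip through:

```shell
# Hypothetical sample of the aiodnsbrute CSV output (first field = hostname)
cat > /tmp/yahoo.com.csv <<'EOF'
www.yahoo.com,98.137.11.163
mail.yahoo.com,69.147.88.8
EOF

# Keep only the hostname column, anchored to the target domain
cut -d, -f1 /tmp/yahoo.com.csv | grep '\.yahoo\.com$' > /tmp/yahoo.com.txt
cat /tmp/yahoo.com.txt
```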

Subdomain+Bruteforce List Create

for i in $(cat 11m_sub_wordlist.txt); do echo "$i.bing.com" >> not_resolve_subdomain.txt; done   # first method
cat 11m_sub_wordlist.txt | xargs -L1 -P20 -I@ bash -c "echo @.'bing.com' >> not_resolve_subdomain.txt"   # second method
cat 11m_sub_wordlist.txt | awk '{print $1".bing.com"}' > not_resolve_subdomain.txt   # third method
while read -r sub; do echo "$sub.bing.com" >> not_resolve_subdomain.txt; done < 11m_sub_wordlist.txt   # fourth method
echo "bing.com" > bing.txt ; comb -s="." 11m_sub_wordlist.txt bing.txt > not_resolve_subdomain.txt   # fifth method
cat 11m_sub_wordlist.txt | subgen -d "bing.com" > not_resolve_subdomain.txt   # sixth method
goaltdns -h bing.com -w /tmp/11m_sub_wordlist.txt -o not_resolve_subdomain.txt   # seventh method
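All of these methods should emit the same candidate list. A quick sanity check with a toy wordlist (file names here are illustrative) comparing the awk and while-read variants:

```shell
# Toy wordlist standing in for 11m_sub_wordlist.txt
printf 'www\nmail\napi\n' > /tmp/mini_wordlist.txt

# Third method (awk)
awk '{print $1".bing.com"}' /tmp/mini_wordlist.txt > /tmp/out_awk.txt

# Fourth method (while read)
while read -r sub; do echo "$sub.bing.com"; done < /tmp/mini_wordlist.txt > /tmp/out_while.txt

# The two outputs should be identical
diff /tmp/out_awk.txt /tmp/out_while.txt && echo "identical"
```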

List Bruteforce Create

cat 3.txt
bing.com
tesla.com
google.com
while read -r sub; do awk -v d="$sub" '{print $1"."d}' ~/a/10k.txt >> "${sub}_non_resolve.txt"; done < 3.txt
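Two details matter here: the redirection must be `< 3.txt` (not `< cat 3.txt`), and the variable must be braced as `${sub}` so the shell does not look for a variable named `sub_non_resolve`. The loop can be exercised on toy inputs (all file names below are illustrative):

```shell
# Toy stand-ins for 10k.txt and 3.txt
printf 'www\nmail\n' > /tmp/10k.txt
printf 'bing.com\ntesla.com\n' > /tmp/3.txt

# One candidate file per target domain
while read -r sub; do
    awk -v d="$sub" '{print $1"."d}' /tmp/10k.txt > "/tmp/${sub}_non_resolve.txt"
done < /tmp/3.txt

cat /tmp/bing.com_non_resolve.txt
```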

Ksubdomain and Zdns Alive Subdomain

cat resolvers.txt
1.1.1.1
8.8.8.8
sudo ksubdomain -f not_resolve_subdomain.txt -e 0 -verify -o resolve.txt -s resolvers.txt

cat not_resolve_subdomain.txt | zdns A --threads 10000 --name-servers=1.1.1.1 | jq -r "select(.data.answers[0].name) | .name" |tee resolve_subdomain.txt

Simple Brute-Force Bash Functions

#!/usr/bin/env bash

if [ "$#" -ne 1 ]; then
    echo "Usage: brute.sh domain.com"
    exit 1
fi

bruteforce(){
    domain=$1
    wordlist="/tmp/1000.txt"
    #wordlist=$2
    printf "[*] Brute-force %s start" "$domain"
    printf "                        \r"
    cat "$wordlist" | awk -v url="$domain" '{print $1"."url}' > "/tmp/notresolve-$domain.txt"
    cat "/tmp/notresolve-$domain.txt" | zdns A --threads 200 --name-servers=1.1.1.1 | jq -r "select(.data.answers[0].name) | .name" > "/tmp/bruteforce-$domain.txt"
    cat "/tmp/bruteforce-$domain.txt" | grep -Eo "[a-zA-Z0-9._-]+\.$domain" | sort -u > "resolve-$domain.txt"
    rm "/tmp/notresolve-$domain.txt" "/tmp/bruteforce-$domain.txt"
    echo "[+] $domain brute-force done: $(wc -l < "resolve-$domain.txt")"
}
bruteforce "$1"

#*Demo*
#wget https://raw.githubusercontent.com/rbsec/dnscan/master/subdomains-1000.txt -O /tmp/1000.txt
#bash brute.sh tesla.com

#Resolve 
#cat /tmp/notresolve-$domain.txt|zdns A --threads 500 --name-servers=1.1.1.1 | jq -r "select(.data.answers[0].name) | .name" >/tmp/bruteforce-$domain.txt
#cat /tmp/notresolve-$domain.txt|dnsx -l -t 200 -o /tmp/bruteforce-$domain.txt
#cat /tmp/notresolve-$domain.txt|httpx -threads 100 -o /tmp/bruteforce-$domain.txt

Usage

bash brute.sh bing.com


Tools Github Links

@cihanmehmet (Author)
My terminal continuously gets killed while trying to use the 33M wordlist; the 15M wordlist was okay. Could you please suggest a solution to this problem?


When you brute-force with a large wordlist from your own computer, you cannot get full efficiency because of your internet speed, so it is better to run the scan on a virtual server (VPS).

As @emadshanab said you can split the wordlist into parts and scan.

@RussellMurad

Thank you so much... @emadshanab @cihanmehmet
