

@screetsec
Last active May 25, 2023 16:16
One-liner to collect hidden URL parameters via a passive scan of the Web Archive. The regex runs on a DFA engine; it supports collecting URLs with multiple parameters for fuzzing and removes duplicates.
curl -s "http://web.archive.org/cdx/search/cdx?url=*.bugcrowd.com/*&output=text&fl=original&collapse=urlkey" | grep -P "=" | sed "/\b\(jpg\|png\|js\|svg\|css\|gif\|jpeg\|woff\|woff2\)\b/d" > Output.txt ; for i in $(cat Output.txt);do URL="${i}"; LIST=(${URL//[=&]/=FUZZ&}); echo ${LIST} | awk -F'=' -vOFS='=' '{$NF="FUZZ"}1;' >> Passive_Collecting_URLParamter.txt ; done ; rm Output.txt ; sort -u Passive_Collecting_URLParamter.txt > Passive_Collecting_URLParamter_Uniq.txt
@slowmistio

grep -P ?

error:
usage: grep [-abcDEFGHhIiJLlmnOoqRSsUVvwxZ] [-A num] [-B num] [-C[num]]
[-e pattern] [-f file] [--binary-files=value] [--color=when]
[--context[=num]] [--directories=action] [--label] [--line-buffered]
[--null] [pattern] [file ...]
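That usage output looks like BSD/macOS grep, which does not support -P (PCRE). Since the pattern here is only a literal "=", PCRE is not needed; a possible workaround, assuming the error indeed comes from BSD/macOS grep:

  # Drop-in replacement for the grep stage of the one-liner; "=" is a literal
  # match, so no PCRE engine is required:
  grep "="
  # Alternatively, install GNU grep (e.g. "brew install grep" on macOS) and use:
  ggrep -P "="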
