graysuit
@graysuit
graysuit / sql_dork_list
Created March 4, 2022 09:42 — forked from m0k1/sql_dork_list
Google SQL dork list
trainers.php?id=
play_old.php?id=
declaration_more.php?decl_id=
Pageid=
games.php?id=
newsDetail.php?id=
staff_id=
historialeer.php?num=
product-item.php?id=
news_view.php?id=
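Patterns like the ones above are normally combined with Google's inurl: operator to surface pages that expose that query parameter. A minimal Python sketch of how such query strings could be built from the list (the helper logic and the input file name "sql_dork_list" are assumptions, not part of the gist):

# Sketch: turn each dork pattern into an inurl: Google query string.
# The input file name "sql_dork_list" is an assumption.
with open("sql_dork_list") as f:
    for pattern in (line.strip() for line in f if line.strip()):
        print('inurl:"%s"' % pattern)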
@graysuit
graysuit / mf_500_Bag_of_Words.py
Created February 26, 2022 19:13 — forked from gupul2k/mf_500_Bag_of_Words.py
NLP: Count frequent words in a file
#Author: Sobhan Hota
#Finds most frequent 500 words in a given file
from string import punctuation
from operator import itemgetter
N = 500
words = {}
# Raw string so "\C" and "\P" are not treated as escape sequences
words_gen = (word.strip(punctuation).lower()
             for line in open(r"C:\Python27\Corpus.txt") for word in line.split())
for word in words_gen:
    words[word] = words.get(word, 0) + 1
# Print the N most frequent words, highest count first
for word, count in sorted(words.items(), key=itemgetter(1), reverse=True)[:N]:
    print("%s %d" % (word, count))
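For comparison, the same top-N count can be written more compactly with collections.Counter; this variant is a sketch and not part of the original gist:

from collections import Counter
from string import punctuation

# Count words exactly as above, then let Counter pick the 500 most common.
with open(r"C:\Python27\Corpus.txt") as f:
    tokens = (w.strip(punctuation).lower() for line in f for w in line.split())
    for word, count in Counter(tokens).most_common(500):
        print("%s %d" % (word, count))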
/**
* \file keylogger.h
*
* Copyright (C) 2009-2010, Ongaro Mattia <moongaro at gmail.com>
* All rights reserved.
*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation; either version 2 of the License, or
* (at your option) any later version.

Aggressive URL encode

Python-based CLI tool to aggressively URL-encode strings: rather than encoding only the non-URL-safe characters, this tool encodes every character in the URL.

Usage:

First, add a function to your .bash_profile that calls the script:

function url-encode()
{
 python ~//url_encode.py "$@"
}
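The url_encode.py script itself is not shown in this preview. A minimal sketch of what an encode-every-character helper could look like (the function name and behaviour are assumptions):

import sys

def aggressive_url_encode(text):
    # Percent-encode every byte, not just the characters that
    # urllib.parse.quote() would normally leave untouched.
    return "".join("%{:02X}".format(b) for b in text.encode("utf-8"))

if __name__ == "__main__":
    print(aggressive_url_encode(" ".join(sys.argv[1:])))

With this sketch, running url-encode abc would print %61%62%63.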
public static class A
{
    public static void Main()
    {
        var abc = "%ABC%";
        // MsgBox equivalent in C#; requires a reference to System.Windows.Forms
        System.Windows.Forms.MessageBox.Show(abc);
    }
}
@graysuit
graysuit / rfc3161.txt
Created June 15, 2020 12:41 — forked from Manouchehri/rfc3161.txt
List of free rfc3161 servers.
http://timestamp.globalsign.com/scripts/timstamp.dll
https://timestamp.geotrust.com/tsa
http://timestamp.comodoca.com/rfc3161
http://timestamp.wosign.com
http://tsa.startssl.com/rfc3161
http://time.certum.pl
http://timestamp.digicert.com
https://freetsa.org
http://dse200.ncipher.com/TSS/HttpTspServer
http://tsa.safecreative.org
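These servers all speak the RFC 3161 protocol over HTTP: the client POSTs a DER-encoded timestamp query with Content-Type application/timestamp-query and receives a timestamp token back. A minimal Python sketch, assuming a query file request.tsq was already produced (for example with: openssl ts -query -data file.bin -sha256 -cert -out request.tsq):

import urllib.request

# POST the DER-encoded timestamp query to one of the servers above.
with open("request.tsq", "rb") as f:
    req = urllib.request.Request(
        "http://timestamp.digicert.com",
        data=f.read(),
        headers={"Content-Type": "application/timestamp-query"},
    )
with urllib.request.urlopen(req) as resp, open("response.tsr", "wb") as out:
    out.write(resp.read())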
@graysuit
graysuit / Bots & Crawlers User-Agent list.txt
Last active January 6, 2021 17:45
Nowadays, every professional website has its own bots and crawlers. They are used to fetch a page when a user shares its link/URL. They follow links to protect against phishing and scams, and to generate better link previews.
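A typical server-side use is to recognise these crawlers from the User-Agent header so the page can serve link-preview metadata. A small Python sketch under that assumption, using substrings taken from the entries below:

# Sketch: classify a request as a known link-preview crawler by User-Agent.
CRAWLER_MARKERS = ("Googlebot", "Facebot", "facebookexternalhit")

def is_known_crawler(user_agent):
    # Substring match, because real User-Agent strings carry versions and
    # extra tokens, e.g. "Mozilla/5.0 (compatible; Googlebot/2.1; ...)".
    return any(marker in user_agent for marker in CRAWLER_MARKERS)

print(is_known_crawler("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))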
Google:
Googlebot
Googlebot/2.1
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Facebook & Instagram & Messenger:
Facebot
facebookexternalhit
facebookexternalhit/1.1