@cobryan05
cobryan05 / fixNvPe.py
Last active January 25, 2024 14:33
Python script to disable ASLR and make .nv_fatb sections read-only to reduce memory commit
# Simple script to disable ASLR and make .nv_fatb sections read-only
# Requires: pefile ( python -m pip install pefile )
# Usage: fixNvPe.py --input path/to/*.dll
import argparse
import pefile
import glob
import os
import shutil
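The preview cuts off after the imports. The core idea is to clear the PE's dynamic-base (ASLR) flag and strip the write bit from .nv_fatb sections so Windows can share those pages instead of committing private copies. A minimal sketch of that technique with pefile follows; the function name, constants spelled out inline, and the ".fixed" output suffix are illustrative, not taken from the gist.

# Minimal sketch (not the gist's actual code) of patching a PE with pefile
IMAGE_DLLCHARACTERISTICS_DYNAMIC_BASE = 0x0040   # ASLR flag in the PE optional header
IMAGE_SCN_MEM_WRITE = 0x80000000                 # section "writable" permission bit

def patch_pe(path):
    pe = pefile.PE(path)
    # Clear the dynamic-base flag so the loader maps the image at its preferred base (no ASLR)
    pe.OPTIONAL_HEADER.DllCharacteristics &= ~IMAGE_DLLCHARACTERISTICS_DYNAMIC_BASE
    # Strip the write bit from .nv_fatb sections so their pages can be shared rather than committed
    for section in pe.sections:
        if section.Name.rstrip(b"\x00") == b".nv_fatb":
            section.Characteristics &= ~IMAGE_SCN_MEM_WRITE
    pe.write(path + ".fixed")   # write the patched copy next to the original
    pe.close()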
@denizssch
denizssch / XpSerials.txt
Created July 21, 2019 00:13
Windows XP ALL Serial Keys :) (For testing purposes [e.g. VM or PenTest])
FCKGW-RHQQ2-YXRKT-8TG6W-2B7Q8
Windows XP PRO Corporate serial number S/N: Key: MQPWW-PGVKX-YPMKG-8DH3G-KC8PW
Windows XP Home Edition serial number S/N: 034634-262024-171505-828316-729010-413531-800424-400442
Windows XP 64 serial number S/N: B2RBK-7KPT9-4JP6X-QQFWM-PJD6G
Windows XP serial number S/N: K6C2K-KY62K-DQR84-RD4QV-QB74Q
Windows XP Professional 64-bit Corporate Edition 5.2.3790.1830 serial number S/N: VCFQD-V9FX9-46WVH-K3CD4-4J3JM
Microsoft Windows XP Professional SP2 serial number S/N: YY8F2-3CKVQ-RKTRG-6JMDR-9DTG6
Windows XP Professional Service Pack 1 (SP1) serial number S/N: F46YY-2R8VQ-R8GMY-926VK-6BQ73
Windows XP Pro serial number S/N: KBWR7-76BD8-J7MDQ-KKG&C-V9Q2J
@Lexie88rus
Lexie88rus / silu_inplace_implementation.py
Created July 10, 2019 08:16
Example implementation of an in-place SiLU activation function
import torch

def silu_inplace_2(input):
    '''
    Example of an in-place SiLU implementation using torch.sigmoid_
    https://arxiv.org/pdf/1606.08415.pdf
    '''
    result = input.clone()    # keep a copy of the original input x
    torch.sigmoid_(input)     # input now holds sigmoid(x)
    input *= result           # input now holds x * sigmoid(x) = SiLU(x)
    return input
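As a quick check (not part of the gist), the in-place version can be compared against PyTorch's built-in torch.nn.functional.silu, available in recent PyTorch releases:

import torch
import torch.nn.functional as F

x = torch.randn(4)
expected = F.silu(x)                  # reference SiLU computed before x is modified
out = silu_inplace_2(x)               # overwrites x with the result and returns it
print(torch.allclose(out, expected))  # True
print(out is x)                       # True: the input tensor was modified in place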
@Lexie88rus
Lexie88rus / SiLU_demo_class.py
Created June 27, 2019 08:44
SiLU demo (in class)
import torch.nn as nn

# create a class for a basic fully-connected deep neural network
class ClassifierSiLU(nn.Module):
    '''
    Demo classifier model class to demonstrate SiLU
    '''
    def __init__(self):
        super().__init__()
        # initialize layers
        self.fc1 = nn.Linear(784, 256)
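The listing truncates the class after its first layer. Assuming an MNIST-style setup (784 inputs, 10 classes), a completed sketch could look like the following; the hidden sizes after fc1 and the forward pass are guesses, not taken from the gist, and silu is the function from the SiLU.py gist below.

import torch
import torch.nn as nn

def silu(x):
    # SiLU(x) = x * sigmoid(x), matching the SiLU.py gist below
    return x * torch.sigmoid(x)

class ClassifierSiLUSketch(nn.Module):
    '''Sketch of a completed classifier; layer sizes beyond fc1 are assumptions'''
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 256)
        self.fc2 = nn.Linear(256, 64)    # assumed hidden size
        self.fc3 = nn.Linear(64, 10)     # assumed 10-class output

    def forward(self, x):
        x = x.view(x.shape[0], -1)       # flatten input images to vectors
        x = silu(self.fc1(x))
        x = silu(self.fc2(x))
        return self.fc3(x)               # raw logits; pair with nn.CrossEntropyLoss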
@Lexie88rus
Lexie88rus / SiLU.py
Created June 27, 2019 08:38
SiLU implementation
import torch

# simply define a silu function
def silu(input):
    '''
    Applies the Sigmoid Linear Unit (SiLU) function element-wise:
    SiLU(x) = x * sigmoid(x)
    '''
    # torch.sigmoid is a built-in PyTorch op, so this is an efficient element-wise implementation
    return input * torch.sigmoid(input)
# create a class wrapper from PyTorch nn.Module, so