ben_oght_ah_eight (ben0it8)

  • Aignostics
  • Berlin
@ben0it8
ben0it8 / finetuning_config.py
Last active July 17, 2019 09:06
Fine-tuning config
from collections import namedtuple
import torch

LOG_DIR = "./logs/"
CACHE_DIR = "./cache/"
# run on GPU if available
device = "cuda" if torch.cuda.is_available() else "cpu"

# immutable container for the fine-tuning hyperparameters; the field list is
# cut off in the gist preview, so the trailing fields are assumptions added
# to make the snippet runnable
FineTuningConfig = namedtuple('FineTuningConfig',
    field_names="num_classes, dropout, init_range, batch_size, lr, max_norm,"
                " n_epochs, device, log_dir")
@ben0it8
ben0it8 / load_pretrained_transformer.py
Last active July 17, 2019 09:00
load pretrained NAACL Transformer
import torch
from pytorch_transformers import cached_path

# download the pre-trained model weights and training config
# (cached_path caches the files locally and returns the local path)
state_dict = torch.load(cached_path("https://s3.amazonaws.com/models.huggingface.co/"
                                    "naacl-2019-tutorial/model_checkpoint.pth"),
                        map_location='cpu')
config = torch.load(cached_path("https://s3.amazonaws.com/models.huggingface.co/"
                                "naacl-2019-tutorial/model_training_args.bin"))
# init model: Transformer base + classifier head (cut off in the gist preview)
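A sketch of how the truncated initialization might continue, assuming a TransformerWithClfHead module built from the Transformer base in the transformer_models.py gist below; the class name, its signature, and the strict=False handling are assumptions:

# build the model from the downloaded config and load the pre-trained
# weights; strict=False tolerates the classifier-head parameters that
# are missing from the base-model checkpoint
model = TransformerWithClfHead(config=config, fine_tuning_config=finetuning_config)
incompatible_keys = model.load_state_dict(state_dict, strict=False)
print(f"Parameters discarded from the checkpoint: {incompatible_keys.unexpected_keys}")
print(f"Parameters added in the model: {incompatible_keys.missing_keys}")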
@ben0it8
ben0it8 / transformer_models.py
Last active July 12, 2019 13:27
Transformer models
import torch
import torch.nn as nn

class Transformer(nn.Module):
    "Adapted from https://github.com/huggingface/naacl_transfer_learning_tutorial"
    def __init__(self, embed_dim, hidden_dim, num_embeddings, num_max_positions,
                 num_heads, num_layers, dropout, causal):
        super().__init__()
        self.causal = causal
        # token and position embeddings (the rest of __init__ is cut off in the
        # gist preview; the embedding layers below follow the tutorial repo)
        self.tokens_embeddings = nn.Embedding(num_embeddings, embed_dim)
        self.position_embeddings = nn.Embedding(num_max_positions, embed_dim)
        self.dropout = nn.Dropout(dropout)
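For scale, instantiating the base model might look like this; the hyperparameters are illustrative stand-ins, not values from the gist:

# a causal (left-to-right) language-model configuration
model = Transformer(embed_dim=410, hidden_dim=2100, num_embeddings=50000,
                    num_max_positions=256, num_heads=10, num_layers=16,
                    dropout=0.1, causal=True)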
@ben0it8
ben0it8 / cfg-init
Last active March 3, 2019 22:01
initialize dotfiles
# install oh-my-zsh
sh -c "$(curl -fsSL https://raw.githubusercontent.com/robbyrussell/oh-my-zsh/master/tools/install.sh)"
# append zsh config; single quotes keep $HOME/$ZSH unexpanded until ~/.zshrc runs
echo 'export ZSH=$HOME/.oh-my-zsh' >> ~/.zshrc
echo 'source $ZSH/oh-my-zsh.sh' >> ~/.zshrc
echo 'ZSH_THEME="robbyrussell"' >> ~/.zshrc
echo 'plugins=(git python osx web-search vi-mode dotenv)' >> ~/.zshrc
# 'config' is git scoped to the bare dotfiles repo, with $HOME as the work tree
echo "alias config='/usr/bin/git --git-dir=$HOME/.cfg/ --work-tree=$HOME'" >> ~/.zshrc
source ~/.zshrc
# keep the repo directory itself untracked
echo ".cfg" >> .gitignore
git clone --bare https://github.com/ben0it8/dotfiles.git $HOME/.cfg
config checkout
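After the checkout, dotfiles in $HOME are managed through the config alias exactly like an ordinary git repo; a typical session (commands are illustrative):

config status
config add ~/.zshrc
config commit -m "update zsh config"
config push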