@ben0it8
Last active July 17, 2019 09:06
Fine-tuning config
from collections import namedtuple

import torch

LOG_DIR = "./logs/"
CACHE_DIR = "./cache/"
BATCH_SIZE = 32  # assumed value; not defined in this gist, set to match your training setup

# Train on GPU if one is available, otherwise fall back to CPU
device = "cuda" if torch.cuda.is_available() else "cpu"

# Immutable container holding all fine-tuning hyperparameters
FineTuningConfig = namedtuple(
    "FineTuningConfig",
    field_names="num_classes, dropout, init_range, batch_size, lr, max_norm, "
                "n_warmup, valid_pct, gradient_acc_steps, device, log_dir, dataset_cache",
)

finetuning_config = FineTuningConfig(
    2, 0.1, 0.02, BATCH_SIZE, 6.5e-5, 1.0,
    10, 0.1, 1, device, LOG_DIR,
    CACHE_DIR + "dataset_cache.bin",
)

finetuning_config
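As a quick sketch of how such a namedtuple config is consumed downstream, the snippet below mirrors the `FineTuningConfig` pattern with a reduced field set (the `torch` dependency is dropped so it runs standalone; field names and values are illustrative):

```python
from collections import namedtuple

# Minimal two-field config, following the same pattern as FineTuningConfig above
Config = namedtuple("Config", field_names="num_classes, lr")
cfg = Config(2, 6.5e-5)

# Fields are read by attribute name, so call sites stay self-documenting
num_labels = cfg.num_classes

# namedtuples are immutable; _replace returns a modified copy,
# which is handy for hyperparameter sweeps
cfg_low_lr = cfg._replace(lr=1e-5)

# _asdict() converts the config to a plain dict, e.g. for logging
logged = cfg._asdict()
```

Because the tuple is immutable, a config object can be shared freely between training utilities without risk of one of them mutating a hyperparameter mid-run.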