@manujosephv
Created November 26, 2020 16:40
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

# Load the fine-tuned GPT-2 tokenizer and model from the local checkpoint
tokenizer = AutoTokenizer.from_pretrained("./storage/gpt2-motivational_v6")
model = AutoModelForCausalLM.from_pretrained("./storage/gpt2-motivational_v6")
gpt2_finetune = pipeline("text-generation", model=model, tokenizer=tokenizer)

# gen_kwargs holds generation options such as max_length,
# beam-search settings, top-p, top-k, etc.
# `seed` is the prompt string the model continues from
gen_text = gpt2_finetune(seed, **gen_kwargs)
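As a sketch of what `gen_kwargs` might contain: the keys below are standard `generate()` parameters in Hugging Face Transformers (the specific values here are illustrative assumptions, not taken from the original gist).

```python
# Hypothetical gen_kwargs for the pipeline call above.
gen_kwargs = {
    "max_length": 60,            # cap on total generated tokens
    "do_sample": True,           # sample instead of greedy decoding
    "top_k": 50,                 # keep only the 50 most likely next tokens
    "top_p": 0.95,               # nucleus sampling threshold
    "num_return_sequences": 3,   # generate three candidate continuations
}
```

With `do_sample=True`, top-k and top-p filtering are applied together; setting `num_beams` instead would switch to beam search.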