
@caleb-kaiser
Created November 6, 2019 23:16
# predictor.py
# Assumes `tokenizer`, `model`, and `sample_sequence` are defined elsewhere
# (e.g., a GPT-2 tokenizer/model and a sampling helper loaded at startup).
def predict(sample, metadata):
    # Encode the input text into token IDs
    indexed_tokens = tokenizer.encode(sample["text"])
    # Generate `num_words` new tokens on the configured device
    output = sample_sequence(model, metadata["num_words"], indexed_tokens, device=metadata["device"])
    # Decode the generated token IDs back into text
    return tokenizer.decode(
        output[0, 0:].tolist(), clean_up_tokenization_spaces=True, skip_special_tokens=True
    )
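For context, a minimal sketch of how this predictor might be invoked. The input shapes are assumptions inferred from the snippet above: a sample dict with a "text" field, and a metadata dict providing "num_words" and "device".

# Hypothetical invocation (not part of the gist): build the inputs the
# predictor expects and print the generated continuation.
sample = {"text": "Machine learning is"}
metadata = {"num_words": 20, "device": "cpu"}
print(predict(sample, metadata))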