
@rezamarzban
Last active July 14, 2024 12:58
GPT 3.5: text generation with EleutherAI/gpt-neo-2.7B via Hugging Face Transformers
import torch
from transformers import GPTNeoForCausalLM, GPT2Tokenizer

# Initialize tokenizer and model (EleutherAI's GPT-Neo 2.7B checkpoint)
tokenizer = GPT2Tokenizer.from_pretrained("EleutherAI/gpt-neo-2.7B")
model = GPTNeoForCausalLM.from_pretrained("EleutherAI/gpt-neo-2.7B")

# Example usage
text = "Replace with your input text here."

# Tokenize input text into PyTorch tensors
inputs = tokenizer(text, return_tensors="pt")

# Generate output (no gradients needed for inference)
with torch.no_grad():
    outputs = model.generate(**inputs, max_length=100, num_return_sequences=1)

# Decode output tokens to text
output_text = tokenizer.decode(outputs[0], skip_special_tokens=True)

# Print or use the generated text
print("Input Text:", text)
print("Generated Text:", output_text)