
@aiXpertLab
Created February 29, 2024 00:41
from transformers import AutoTokenizer, AutoModelForCausalLM

# Path to a locally downloaded Gemma 2B checkpoint
model_name = "E:/models/2b-gemma"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

input_text = "Write me a poem about a beautiful woman."
# Tokenize the prompt and move the tensors to the same device as the model
input_ids = tokenizer(input_text, return_tensors="pt").to(model.device)

outputs = model.generate(**input_ids)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
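
With no arguments, generate falls back to the model's default generation settings, which usually cap the output at a short continuation. A minimal sketch of passing explicit generation parameters, reusing the model, tokenizer, and input_ids from above; the specific values here are illustrative, not tuned:

# Sketch: control output length and enable sampling (values are illustrative)
outputs = model.generate(
    **input_ids,
    max_new_tokens=200,   # upper bound on newly generated tokens
    do_sample=True,       # sample instead of greedy decoding
    temperature=0.7,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))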