
@sappho192
Created February 26, 2024 08:48
DaramGPT inference example
from transformers import AutoTokenizer, GPTJForCausalLM

tokenizer = AutoTokenizer.from_pretrained("d:/MODEL/DaramGPT")
model = GPTJForCausalLM.from_pretrained("d:/MODEL/DaramGPT")

# Korean prompt: "The weather in Seoul is "
inputs = tokenizer("서울의 날씨는 ", return_tensors="pt")

# Generate up to 256 tokens; repetition_penalty > 1.0 discourages repeated text
output_ids = model.generate(
    **inputs,
    max_length=256,
    repetition_penalty=1.1,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
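As a side note, `repetition_penalty=1.1` follows the CTRL-style scheme that `generate()` applies: at each step, the logit of every token already present in the sequence is divided by the penalty when positive and multiplied by it when negative, making repeats less likely. A minimal pure-Python sketch of that logic (the function name and toy logits are illustrative, not part of the gist):

```python
def apply_repetition_penalty(logits, generated_ids, penalty=1.1):
    """Sketch of the CTRL-style repetition penalty.

    For each token id already generated, divide its logit by `penalty`
    if positive, multiply if negative, so repeated tokens score lower.
    """
    out = list(logits)
    for tok in set(generated_ids):
        if out[tok] > 0:
            out[tok] /= penalty
        else:
            out[tok] *= penalty
    return out

# Toy vocabulary of 3 tokens; tokens 0 and 1 were already generated.
penalized = apply_repetition_penalty([2.0, -1.0, 0.5], generated_ids=[0, 1])
```

Here token 0's logit drops from 2.0 to about 1.82 and token 1's falls from -1.0 to -1.1, while the unseen token 2 is untouched.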