@rayyildiz
Last active June 19, 2023 19:28
ChatGPT-style demo with OpenLLaMA

An LLM chat example using open_llama_3b.

Setup

Install Anaconda and create a new env.

Below are the commands for macOS; see the conda cheat sheet for the Windows and Linux equivalents.

conda create --name chatgpt-310 python=3.10
conda activate chatgpt-310

Running the Example

pip install tiktoken transformers torch sentencepiece accelerate
python chat.py

The script prints:

Q: What is opposite color of black?

A: Black is the opposite of white.
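The answer above comes from completing a simple "Q: … A:" prompt. One way to extend this is to prepend earlier Q/A pairs as few-shot context; the helper below is a sketch of that idea (the function name and exact format are choices for illustration, not something OpenLLaMA requires):

```python
def build_prompt(question, examples=()):
    """Build a Q/A style prompt, optionally preceded by few-shot examples."""
    # Each example is a (question, answer) pair rendered in the same format.
    parts = [f"Q: {q}\nA: {a}" for q, a in examples]
    # The final question is left open so the model completes the answer.
    parts.append(f"Q: {question}\nA:")
    return "\n".join(parts)

print(build_prompt("What is opposite color of black?"))
# → Q: What is opposite color of black?
# → A:
```

The returned string can be passed to the tokenizer exactly like the hard-coded prompt in chat.py.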

import torch
from transformers import LlamaTokenizer, LlamaForCausalLM

# Pick a model size; the 3B variant is the lightest.
model_path = 'openlm-research/open_llama_3b'
# model_path = 'openlm-research/open_llama_7b'
# model_path = 'openlm-research/open_llama_13b'

tokenizer = LlamaTokenizer.from_pretrained(model_path)
# float16 weights with device_map='auto' lets accelerate place the model
# on the available GPU(s), falling back to CPU.
model = LlamaForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.float16, device_map='auto',
)

# Simple Q/A style prompt.
prompt = 'Q: What is opposite color of black?\nA:'
input_ids = tokenizer(prompt, return_tensors='pt').input_ids

# Greedy decoding, limited to 10 new tokens.
generation_output = model.generate(
    input_ids=input_ids, max_new_tokens=10,
)

# Note: the output sequence includes the prompt tokens as well.
print(tokenizer.decode(generation_output[0]))
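model.generate returns the prompt tokens followed by the newly generated ones, which is why the printed answer repeats the question. To print only the answer, slice off the prompt length before decoding; with the tensors above that would be generation_output[0][input_ids.shape[1]:]. A minimal sketch of the slicing logic, with plain lists standing in for tensors and hypothetical token ids:

```python
def extract_answer_ids(prompt_ids, output_ids):
    """Drop the echoed prompt tokens from a generate()-style output sequence."""
    assert output_ids[:len(prompt_ids)] == prompt_ids, "output should start with the prompt"
    return output_ids[len(prompt_ids):]

# Plain lists stand in for the tensors in chat.py; the ids are made up.
prompt_ids = [1, 894, 29901]
output_ids = [1, 894, 29901, 512, 278]
print(extract_answer_ids(prompt_ids, output_ids))  # → [512, 278]
```

In the real script you would then decode the remaining ids with tokenizer.decode(..., skip_special_tokens=True).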