@chunhualiao
Last active April 16, 2024 12:17
Example code to use the CodeLlama 7B Instruct model (HuggingFace Transformers version)

Based on discussions on https://huggingface.co/codellama/CodeLlama-7b-Instruct-hf/discussions/10

# Load the tokenizer and model, then wrap them in a high-level pipeline
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

tokenizer = AutoTokenizer.from_pretrained("codellama/CodeLlama-7b-Instruct-hf")
model = AutoModelForCausalLM.from_pretrained("codellama/CodeLlama-7b-Instruct-hf")


# Create a pipeline
code_generator = pipeline('text-generation', model=model, tokenizer=tokenizer)

# Generate code for an input string.
# max_new_tokens bounds only the generated continuation;
# max_length would also count the prompt tokens.
input_string = "Write a python function to calculate the factorial of a number"
generated_code = code_generator(input_string, max_new_tokens=100)[0]['generated_text']
print(generated_code)
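The CodeLlama Instruct models were fine-tuned on a chat template that wraps the user request in `[INST] ... [/INST]` tags, and prompting in that format generally yields better instruction following than a bare string. A minimal sketch of building such a prompt (the helper name is illustrative; the template follows the model card, and the tokenizer adds the BOS token itself, so it is not included here):

```python
# Sketch: wrap a user request in the [INST] template used by
# CodeLlama Instruct models. Helper name is an assumption, not an API.
def build_instruct_prompt(user_message: str) -> str:
    # The instruct fine-tune expects the request between [INST] and [/INST]
    return f"[INST] {user_message.strip()} [/INST]"

prompt = build_instruct_prompt(
    "Write a python function to calculate the factorial of a number"
)
print(prompt)
```

The resulting `prompt` string can be passed to `code_generator` in place of the bare `input_string` above.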