Create a Python virtual environment:
- python3 -m venv ollama
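The script further down imports the `ollama` Python package, which is not installed automatically; it needs to be installed inside this environment. A minimal sketch, assuming a Unix-like shell:

```shell
# Create and activate the virtual environment
python3 -m venv ollama
source ollama/bin/activate

# Install the Ollama Python client inside the environment
pip install ollama
```

On Windows, activate with `ollama\Scripts\activate` instead.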

Install Ollama:
- Open a terminal and run: curl -fsSL https://ollama.com/install.sh | sh

Download a model of your choice:
- ollama pull MODEL_NAME   # e.g. llama3 (works on 4 GB of VRAM)

Create the following script and save it as test.py:

import ollama

# Context prepended to the question; adjust this to your use case
context_prompt = "You are a concise technical assistant."

# Construct the prompt
full_prompt = f"{context_prompt} What is a blob loss?"

# Generate a response (the Ollama server must be running)
response = ollama.generate(model='llama3', prompt=full_prompt)
print(response['response'])

In your terminal, cd into the directory containing the script and run:
- python test.py
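The prompt construction in test.py can be factored into a small helper so the context and question stay separate. This is a sketch; `build_prompt` and the example context string are illustrative, not part of the Ollama API:

```python
def build_prompt(context: str, question: str) -> str:
    """Combine a context string and a question into a single prompt."""
    return f"{context.strip()} {question.strip()}"

# Example usage (the context string here is just a placeholder)
context_prompt = "You are a concise technical assistant."
full_prompt = build_prompt(context_prompt, "What is a blob loss?")
print(full_prompt)
```

The resulting string is what gets passed as `prompt=` to `ollama.generate`.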