@jrbourbeau
Last active May 9, 2024 20:02
Ollama on Coiled
import coiled
import ollama
from distributed import print  # print that forwards output from the remote worker


# Run chat() on a cloud VM managed by Coiled rather than locally
@coiled.function(name="ollama")
def chat():
    model = "llama3:latest"
    print(f"Pulling {model} model...")
    ollama.pull(model)  # download model weights on the VM (one-time per VM)
    print(f"Done pulling {model}")
    # Stream the response so tokens are printed as they are generated
    stream = ollama.chat(
        model=model,
        messages=[{"role": "user", "content": "Why is the sky blue?"}],
        stream=True,
    )
    for chunk in stream:
        print(chunk["message"]["content"], end="", flush=True)


chat()
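Each chunk yielded by `ollama.chat(..., stream=True)` carries a fragment of the reply under `chunk["message"]["content"]`; the loop above prints them as they arrive, and joining them reconstructs the full answer. A minimal sketch of that assembly using mock chunks (no Ollama server needed; the chunk contents here are made up for illustration):

```python
# Mock chunks mimicking the shape of ollama.chat(..., stream=True) output
mock_stream = [
    {"message": {"role": "assistant", "content": "The sky is blue "}},
    {"message": {"role": "assistant", "content": "mainly because of "}},
    {"message": {"role": "assistant", "content": "Rayleigh scattering."}},
]


def assemble(stream):
    """Concatenate streamed content fragments into the full reply."""
    return "".join(chunk["message"]["content"] for chunk in stream)


print(assemble(mock_stream))
# → The sky is blue mainly because of Rayleigh scattering.
```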
#!/usr/bin/env bash
# Setup script run on the cloud VM: install Ollama, then start the server
# listening on all interfaces so the Coiled function can reach it.
curl https://ollama.ai/install.sh | sh
export OLLAMA_HOST="0.0.0.0"
ollama serve