@jimbrig
Forked from JBGruber/docker-compose.yml
Created April 29, 2024 19:01
My compose file to run ollama and ollama-webui
version: '3.6'
services:
  # ollama and API
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    pull_policy: missing
    tty: true
    restart: unless-stopped
    # Expose the Ollama API outside the container stack
    ports:
      - 11434:11434
    volumes:
      - ollama:/root/.ollama
    # GPU support (comment out this deploy block if you don't have an NVIDIA GPU)
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities:
                - gpu
  # webui, navigate to http://localhost:3000/ to use
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    pull_policy: missing
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - 3000:8080
    environment:
      - "OLLAMA_API_BASE_URL=http://ollama:11434/api"
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped

volumes:
  ollama: {}
  open-webui: {}
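
If you don't have an NVIDIA GPU, rather than editing the compose file directly you can override the `deploy` block in a sibling file. A minimal sketch (the file name `docker-compose.override.yml` is the standard Compose override location; the `deploy: !reset null` syntax assumes a Compose version recent enough to support reset tags):

docker-compose.override.yml:

  services:
    ollama:
      deploy: !reset null

Compose merges the override automatically when you run `docker compose up -d` in the same directory, so the GPU reservation is dropped without touching the original file.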
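
Once the stack is up, the Ollama API is reachable on the host at port 11434 (per the `ports` mapping above). A minimal sketch of building a request body for Ollama's `/api/generate` endpoint — the model name `llama3` is an assumption (use whatever model you have pulled), and `build_generate_payload` is a hypothetical helper, not part of any library:

```python
import json

def build_generate_payload(model: str, prompt: str, stream: bool = False) -> str:
    """Return the JSON body Ollama's /api/generate endpoint expects.

    stream=False asks Ollama for a single JSON response instead of a
    stream of partial chunks.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

# "llama3" is a placeholder model name; POST this body to
# http://localhost:11434/api/generate (e.g. with curl or urllib.request)
# once a model has been pulled into the ollama container.
body = build_generate_payload("llama3", "Why is the sky blue?")
```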