Get traefik up and running with ollama + open-webui (WIP)
# docker-compose.yaml
version: '3.8'
services:
  traefik:
    image: traefik:v2.10
    container_name: "traefik"
    # Enables the web UI and tells Traefik to listen to docker
    command:
      - "--api.insecure=true"
      - "--providers.docker=true"
      - "--providers.docker.exposedbydefault=false"
      - "--entrypoints.websecure.address=:443"
      - "--certificatesresolvers.myresolver.acme.tlschallenge=true"
      # replace with a real email address for Let's Encrypt registration
      - "--certificatesresolvers.myresolver.acme.email=me@email.com"
      - "--certificatesresolvers.myresolver.acme.storage=/letsencrypt/acme.json"
      - "--log.level=INFO"
      - "--log.filePath=/logs/traefik.log"
      - "--accesslog=true"
      - "--accesslog.filePath=/logs/access.log"
      - "--accesslog.bufferingsize=50"
    ports:
      - "8082:8080"   # Traefik dashboard/API (insecure mode) on host port 8082
      - "443:443"
    volumes:
      - "./letsencrypt:/letsencrypt"
      - "/var/run/docker.sock:/var/run/docker.sock:ro"
      - "./logs/:/logs/"
  whoami:
    # A container that exposes an API to show its IP address
    image: traefik/whoami
    container_name: "whoami"
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.whoami.rule=Host(`whoami.example.com`)"
      - "traefik.http.routers.whoami.entrypoints=websecure"
      - "traefik.http.routers.whoami.tls.certresolver=myresolver"
  nextchat:
    # ChatGPT-style web UI (ChatGPT-Next-Web) with custom settings
    image: yidadaa/chatgpt-next-web
    container_name: "nextchat"
    labels:
      - "traefik.enable=true"
      - "traefik.http.routers.nextchat.rule=Host(`nextchat.example.com`)"
      - "traefik.http.routers.nextchat.entrypoints=websecure"
      - "traefik.http.routers.nextchat.tls.certresolver=myresolver"
  ollama:
    # Ollama makes it easy to get up and running with large language models locally.
    image: ollama/ollama
    restart: unless-stopped
    ports:
      - "11434:11434"
    volumes:
      - "~/ollama:/root/.ollama"
    # GPU reservation assumes the NVIDIA Container Toolkit is installed on the host;
    # drop the deploy block to run on CPU only.
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities:
                - gpu
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    volumes:
      - "~/open-webui:/app/backend/data"
    depends_on:
      - ollama
    ports:
      - 3000:8080
    environment:
      - 'OLLAMA_API_BASE_URL=http://ollama:11434/api'
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped
    labels:
      - "traefik.enable=true"
      # router is named "ollama" but serves the open-webui frontend
      - "traefik.http.routers.ollama.rule=Host(`ollama.example.com`)"
      - "traefik.http.routers.ollama.entrypoints=websecure"
      - "traefik.http.routers.ollama.tls.certresolver=myresolver"