@diegofcornejo
Last active June 16, 2024 07:11
Run Ollama Server and Open WebUI with Docker Compose.

ollama-open-webui

This project sets up Ollama and Open WebUI using Docker Compose.

Instructions

  1. Start the services with Docker Compose:

    docker-compose up -d
    # or
    docker compose up -d
  2. Once the services are up, run a model with:

    docker exec -it ollama ollama run qwen2:1.5b
    # or download models using Open WebUI.
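After startup you may want to verify that everything is reachable. A sketch of some checks (note that this compose file publishes only Open WebUI's port 8080 to the host, not Ollama's 11434, so the Ollama CLI is invoked inside the container):

    # Confirm both containers are up
    docker compose ps

    # List models available to the Ollama server (runs inside the container,
    # since port 11434 is not published to the host in this setup)
    docker exec ollama ollama list

    # Open WebUI is served on the host; expect an HTTP 200
    curl -s -o /dev/null -w '%{http_code}\n' http://localhost:8080

If you do want the Ollama API reachable from the host, add a `ports: - 11434:11434` mapping to the `ollama` service.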
docker-compose.yml

    name: 'ollama-open-webui'
    services:
      ollama:
        image: ollama/ollama
        container_name: ollama
        pull_policy: always
        tty: true
        restart: unless-stopped
        volumes:
          - ollama:/root/.ollama
        environment:
          - 'OLLAMA_MAX_LOADED_MODELS=4'
          - 'OLLAMA_NUM_PARALLEL=4'
          - 'OLLAMA_KEEP_ALIVE=-1'
      open-webui:
        image: ghcr.io/open-webui/open-webui:main
        container_name: open-webui
        volumes:
          - open-webui:/app/backend/data
        depends_on:
          - ollama
        ports:
          - 8080:8080
        environment:
          - 'OLLAMA_BASE_URL=http://ollama:11434'
        extra_hosts:
          - "host.docker.internal:host-gateway"
        restart: unless-stopped
    volumes:
      ollama:
      open-webui: