
@JBGruber
Last active March 10, 2025 01:26
My compose file to run ollama and ollama-webui
services:
  # ollama and API
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    pull_policy: missing
    tty: true
    restart: unless-stopped
    # Expose Ollama API outside the container stack (but only on the same computer;
    # remove 127.0.0.1: to make Ollama available on your network)
    ports:
      - 127.0.0.1:11434:11434
    volumes:
      - ollama:/root/.ollama
    # GPU support (turn off by commenting with # if you don't have an nvidia gpu)
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities:
                - gpu
  # webui, navigate to http://localhost:3000/ to use
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    pull_policy: missing
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - 3000:8080
    environment:
      - "OLLAMA_API_BASE_URL=http://ollama:11434/api"
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped

volumes:
  ollama: {}
  open-webui: {}
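
The GPU reservation above only works if the NVIDIA Container Toolkit is set up on the host. A minimal sketch to check that Docker can actually reach the GPU before starting the stack (the ubuntu image is just a throwaway example here):

# should print the same GPU table as running nvidia-smi directly on the host;
# if this fails, install the NVIDIA Container Toolkit first, or comment out the
# deploy: block in the compose file above
docker run --rm --gpus all ubuntu nvidia-smi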

JBGruber commented Jan 23, 2024

Download compose file:

wget https://gist.githubusercontent.com/JBGruber/73f9f49f833c6171b8607b976abc0ddc/raw/docker-compose.yml

Download and run ollama (or comment out the GPU "deploy:" block in the compose file if you do not have an Nvidia GPU):

docker-compose up -d
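
Once the containers are up, a quick sanity check (assuming the default ports from the compose file above):

# both containers should be listed as running
docker-compose ps
# the Ollama API should answer with a JSON list of installed models (empty at first)
curl http://localhost:11434/api/tags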

Update:

docker-compose pull

Note, this is the Nvidia-GPU version. Once it's running, navigate to http://localhost:3000/ (or replace localhost with the IP address of the computer this is running on).
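
You can download models through the web UI, or directly via the ollama container; a small sketch from the command line (llama3 is just an example model name):

# pull a model into the ollama volume
docker exec -it ollama ollama pull llama3
# chat with it straight from the terminal
docker exec -it ollama ollama run llama3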


JBGruber commented Mar 1, 2024

More information on how to run this is now available here:

Docker + Ollama install

@luispabon

Why do you expose the port reserved for DNS?


JBGruber commented Oct 1, 2024

I don't remember and I don't see any advantage now. So I just removed it.

@luispabon

👍


DrJohan commented Feb 25, 2025

Hi @JBGruber. I'm trying to update Open WebUI. Can you guide me?
Thank you.


JBGruber commented Mar 2, 2025

Sure! This is how you should do it:

docker-compose down
docker-compose pull --policy "always"
docker-compose up -d

I'm not sure why, but without setting --policy "always", it updates only Ollama, not open-webui.
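
Pulling new images leaves the old ones behind as dangling images; optionally, you can clean them up afterwards:

# remove dangling (untagged) images left over from previous pulls (asks for confirmation)
docker image prune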


DrJohan commented Mar 10, 2025

Thank you so much @JBGruber. This really helps.
