My compose file to run ollama and ollama-webui
services:

  # ollama and API
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    pull_policy: missing
    tty: true
    restart: unless-stopped
    # Expose Ollama API outside the container stack (but only on the same computer;
    # remove 127.0.0.1: to make Ollama available on your network)
    ports:
      - 127.0.0.1:11434:11434
    volumes:
      - ollama:/root/.ollama
    # GPU support (turn off by commenting out this deploy block if you don't have an Nvidia GPU)
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities:
                - gpu

  # webui, navigate to http://localhost:3000/ to use
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    pull_policy: missing
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - 3000:8080
    environment:
      - "OLLAMA_API_BASE_URL=http://ollama:11434/api"
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped

volumes:
  ollama: {}
  open-webui: {}
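With the stack running, you can load a model into Ollama from the command line (the model name below is just an example; pick whichever model you want from the Ollama library):

docker exec -it ollama ollama pull llama3.2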
Why do you expose the port reserved for DNS?
I don't remember, and I don't see any advantage now, so I just removed it.
👍
Hi @JBGruber. I'm trying to update Open WebUI. Can you guide me?
Thank you.
Sure! This is how you should do it:
docker-compose down
docker-compose pull --policy "always"
docker-compose up -d
I'm not sure why, but without setting --policy "always", it updates only Ollama, not open-webui.
Thank you so much @JBGruber. This really helps.
Download the compose file:
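One way to fetch it from this gist (the raw URL and the docker-compose.yml filename are assumptions; adjust as needed):

curl -L https://gist.githubusercontent.com/JBGruber/73f9f49f833c6171b8607b976abc0ddc/raw --output docker-compose.yml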
Download and run ollama and open-webui (or comment out the # GPU support block under the ollama service if you do not have an Nvidia GPU):
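For example, from the directory containing the compose file:

# pulls any missing images and starts both containers in the background
docker-compose up -d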
Update:
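As discussed in the thread above, forcing the pull policy makes sure both images get refreshed:

docker-compose down
docker-compose pull --policy "always"
docker-compose up -d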
Note: this is the Nvidia-GPU version. Once it's running, navigate to http://localhost:3000/ (or replace localhost with the IP address of the computer this is running on).
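To check that the Ollama API itself is reachable from the host, query the published port; Ollama's /api/tags endpoint lists the models you have pulled:

curl http://127.0.0.1:11434/api/tags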