Gist by @webghostx (last active June 2, 2024)
Run Open WebUI in Docker with Ollama on the Host

The usual method of installing Open WebUI bundles Ollama directly into the same container. However, if you also want to use Ollama independently for other purposes, it's preferable to run it on the host and leave it out of the container.

Requirement:

Ollama must be installed on the host, e.g. with:

    curl -fsSL https://ollama.com/install.sh | sh
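To confirm this requirement is met before starting the container, a quick check like the following can help (a minimal sketch; ollama_status is just an illustrative helper name, not part of Ollama):

```shell
# Report whether the 'ollama' binary from the install script is on PATH.
# (ollama_status is a hypothetical helper, used only for this check.)
ollama_status() {
    if command -v ollama >/dev/null 2>&1; then
        # 'ollama --version' prints the installed version string.
        echo "installed: $(ollama --version)"
    else
        echo "missing - run the install script above first"
    fi
}

ollama_status
```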

Goal:

Integrate locally installed Ollama into Open WebUI while enabling access to Ollama from both within the Docker container and from the host.

With --network=host, the container shares the host's network stack, so Ollama is reachable from both sides at http://localhost:11434. Open WebUI itself is then served at http://localhost:8080.

Docker Command:

(Note: adapt paths and options to your setup.)

docker run \
    --add-host=host.docker.internal:host-gateway \
    -d \
    --network=host \
    --gpus all \
    -e WEBUI_AUTH=false \
    -e RAG_ENABLED=true \
    -e RAG_TEMPLATE=/app/backend/data/templates/rag-template.txt \
    -e LOCAL_FILES_ONLY=False \
    -e OLLAMA_BASE_URL=http://localhost:11434 \
    -v /home/yves/Apps/open-webui/bind-mount:/app/backend/data \
    --name open-webui \
    --restart always \
    ghcr.io/open-webui/open-webui:main

(Note: If anyone can optimize this command, suggestions are welcome; I'm not a Docker specialist.)
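Once the container is running, a quick reachability check against both endpoints can confirm the setup (a sketch; check_service is an illustrative helper, and the ports are the ones from the command above):

```shell
# Probe a URL and report whether the service behind it answers.
# (check_service is a hypothetical helper, not part of Docker or Open WebUI.)
check_service() {
    name="$1"; url="$2"
    if curl -fsS --max-time 2 "$url" >/dev/null 2>&1; then
        echo "$name: up"
    else
        echo "$name: down"
    fi
}

check_service "ollama"     "http://localhost:11434/api/version"  # Ollama API on the host
check_service "open-webui" "http://localhost:8080"               # Open WebUI in the container
```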

Parameter Explanation:

  • --add-host=host.docker.internal:host-gateway: Maps the host's gateway IP to host.docker.internal inside the container, facilitating access to the host. (With --network=host this is largely redundant, since the container can already reach the host via localhost.)
  • -d: Operates the container in detached mode (background).
  • --network=host: Utilizes the host's network, enabling the container to use the same network interfaces as the host.
  • --gpus all: Grants access to all GPUs on the host from within the container.
  • Environment Variables (-e):
    • WEBUI_AUTH=false: Disables authentication for Open WebUI.
    • RAG_ENABLED=true: Enables retrieval-augmented generation (RAG) in Open WebUI.
    • RAG_TEMPLATE=/app/backend/data/templates/rag-template.txt: Specifies the path to the RAG template within the container.
    • LOCAL_FILES_ONLY=False: Allows models (e.g. the embedding model used for RAG) to be downloaded from the internet instead of being loaded only from the local cache.
    • OLLAMA_BASE_URL=http://localhost:11434: Defines the URL where Ollama on the host can be reached.
  • -v /home/yves/Apps/open-webui/bind-mount:/app/backend/data: Mounts the local directory /home/yves/Apps/open-webui/bind-mount into the container, enabling data sharing for Open WebUI.
    • This bind mount ensures persistent data storage and accessibility between the host filesystem and the Docker container.
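The bind-mount directory, including the template path that RAG_TEMPLATE points to inside the container, can be prepared on the host beforehand. A sketch, with DATA_DIR standing in for the /home/yves/... path from the command above:

```shell
# DATA_DIR is a placeholder for the host side of the bind mount; adapt it.
DATA_DIR="${DATA_DIR:-$HOME/Apps/open-webui/bind-mount}"

# /app/backend/data inside the container maps to $DATA_DIR on the host, so
# /app/backend/data/templates/rag-template.txt maps to the file created here.
mkdir -p "$DATA_DIR/templates"

# Empty placeholder - fill it with your actual RAG prompt template.
touch "$DATA_DIR/templates/rag-template.txt"

ls "$DATA_DIR/templates"
```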

Additional Information:

  • Ollama: Tool for running and managing large language models locally.
  • Open WebUI: Open-source, self-hosted web interface for working with models served by Ollama and other OpenAI-compatible APIs.
  • Open WebUI Snap: Also available as a snap package, but unfortunately not the latest version.
