ollama/ollama:latest
11434/tcp REST Endpoint
ollama data and models
- /root/.ollama
- log into the Docker container's console (e.g. `docker exec -it <container> bash`)
- ollama pull orca2 && ollama pull llama2
- optionally create a new model: https://gist.github.com/BananaAcid/bf47553432b72ceea8c4ae3aaf39ca9f
- test: ollama run orca2
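The container above can be started with a `docker run` sketch like the following (the container name and host path for the volume are assumptions, not from the original notes):

```shell
# Sketch: run the Ollama image, persist data/models from /root/.ollama on the
# host, and expose the REST endpoint on 11434/tcp. Name and host path are examples.
docker run -d --name ollama \
  -v /srv/ollama_docker:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama:latest

# then pull a model inside the running container
docker exec -it ollama ollama pull orca2
```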
ghcr.io/ollama-webui/ollama-webui:main --> has user management + registration; saves chats on the server per user account
OR: https://github.com/ollama-webui/ollama-webui-lite --> saves chats to the browser's localStorage; no login interface
8080/tcp WebUI
OLLAMA_API_BASE_URL = http://192.168.178.10:27080/api -- the URL of the Ollama API server
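A sketch of running the WebUI container with that variable set (the host port mapping 3000 is an example; the container itself serves on 8080/tcp as noted above):

```shell
# Sketch: run the WebUI and point it at the Ollama API server via
# OLLAMA_API_BASE_URL. Host port 3000 is an assumption.
docker run -d --name ollama-webui \
  -p 3000:8080 \
  -e OLLAMA_API_BASE_URL=http://192.168.178.10:27080/api \
  ghcr.io/ollama-webui/ollama-webui:main
```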
@see: https://github.com/jmorganca/ollama/blob/main/docs/linux.md
curl https://ollama.ai/install.sh | sh
NOTE: The instance is not accessible from other hosts by default; that requires: OLLAMA_HOST=0.0.0.0:11434
Endless keep-alive one-liner (requires httpie):
model=mixtral ; sleepDuration=4m ; nvidia-smi ; counter=0; while true; do ((counter++)) && echo "keep alive # $counter - $(date) ..." && http --ignore-stdin --timeout=120 :11434/api/generate model=$model stream:=false prompt='Only say "1". Do not explain. Do not comment. Short response only.' && echo "sleeping $sleepDuration for $model (# $counter, ctrl+C to stop) - $(date)" && sleep $sleepDuration; done;
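The httpie call above posts a small JSON body to /api/generate. For reference, a sketch of the same request with plain curl (the simplified prompt text is an example):

```shell
# Build roughly the same JSON body that httpie assembles from
# model=... stream:=false prompt='...'
model=mixtral
payload='{"model":"'"$model"'","stream":false,"prompt":"Only say 1. Do not explain."}'
echo "$payload"
# Send it (needs a running server on :11434, hence commented out here):
# curl -s --max-time 120 http://localhost:11434/api/generate -d "$payload"
```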
Set up folders for our customizations:
sudo mkdir -p /srv/ollama_models
sudo mkdir -p /srv/ollama_tools
sudo chown ollama: /srv/ollama_models
sudo chmod 0777 /srv/ollama_models
sudo chown ollama: /srv/ollama_tools
sudo chmod 0777 /srv/ollama_tools
# allow the current user to access the ollama folders above without trouble (e.g. when running `ollama serve` manually)
sudo usermod -a -G ollama $USER
/srv/ollama_tools/init-keepalive-mixtral.sh
#!/bin/bash
model=mixtral
sleepDuration=4m
nvidia-smi
counter=0
while true
do
((counter++))
echo "keep alive # $counter - $(date) ..."
http --ignore-stdin --timeout=120 :11434/api/generate model=$model stream:=false prompt='Only say "1". Do not explain. Do not comment. Short response only.' &&
echo "sleeping $sleepDuration for $model (# $counter, ctrl+C to stop) - $(date)" &&
sleep $sleepDuration
done
To use it, make it executable:
sudo chmod +x /srv/ollama_tools/init-keepalive-mixtral.sh
/srv/ollama_tools/init-keepalive-mixtral.service
# save as /etc/systemd/system/init-keepalive-mixtral.service
[Unit]
Description=Ollama Init and Keepalive Service
After=network-online.target ollama.service
[Service]
ExecStart=/srv/ollama_tools/init-keepalive-mixtral.sh
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/home/admin/.local/bin:/home/admin/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin"
[Install]
WantedBy=default.target
To install it:
sudo cp /srv/ollama_tools/init-keepalive-mixtral.service /etc/systemd/system/init-keepalive-mixtral.service
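After copying the unit files into place, the usual systemd steps to activate both units look like this (unit names as defined above):

```shell
# reload systemd so it picks up the new/changed unit files
sudo systemctl daemon-reload
# enable at boot and start immediately
sudo systemctl enable --now ollama.service
sudo systemctl enable --now init-keepalive-mixtral.service
```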
/srv/ollama_tools/ollama.service
# save as /etc/systemd/system/ollama.service
[Unit]
Description=Ollama Service
After=network-online.target
[Service]
Environment=OLLAMA_MODELS=/srv/ollama_models
Environment=OLLAMA_HOST=0.0.0.0:11434
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/home/admin/.local/bin:/home/admin/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin"
[Install]
WantedBy=default.target
... this is the stock ollama.service, with OLLAMA_MODELS
and OLLAMA_HOST added
Note:
On the CLI, the command below runs nearly the same setup as the service (the user differs, but adding the current user to the ollama group as above should help):
OLLAMA_MODELS=/srv/ollama_models OLLAMA_HOST=0.0.0.0:11434 ollama serve
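Once `ollama serve` is running (either as the service or manually), a quick smoke test against the REST endpoint; host and port match the OLLAMA_HOST above, the model name is an example:

```shell
# list installed models (returns JSON of the form {"models":[...]})
curl -s http://localhost:11434/api/tags
# minimal non-streaming generate request
curl -s http://localhost:11434/api/generate \
  -d '{"model":"mixtral","stream":false,"prompt":"Say 1."}'
```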
Helpful commands:
systemctl enable ollama # or: enable ollama.service
systemctl start ollama
systemctl disable ollama # removes the enable symlink; disable/re-enable is needed if the [Install] section changed
systemctl stop ollama
sudo systemctl daemon-reload # after any change to the unit files, makes systemd re-read them
# get the full log of the service (-e jumps to the end; add -b for current boot only, --no-pager to disable page pausing)
journalctl -u ollama -e
# - same - (-l, complete log without truncating)
systemctl -l status ollama
# live watch the log
journalctl -xefu ollama
journalctl -xefu init-keepalive-mixtral
# fully run as the ollama user with the env set as in the service
sudo runuser -u ollama -- env OLLAMA_MODELS=/srv/ollama_models OLLAMA_HOST=0.0.0.0:11434 ollama serve