@jaypeche
Created December 29, 2024 01:32
ollama-bin-0.5.4-r2 standard gentoo install
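For reference, a minimal Portage setup that would produce the pull below. The overlay name, the ~amd64 keyword and the USE flags are taken from the ebuild line further down in this log; the file paths are just the standard Portage locations and are assumptions, not something shown here:

  /etc/portage/package.accept_keywords/ollama:
    app-misc/ollama-bin::strix-overlay ~amd64

  /etc/portage/package.use/ollama:
    app-misc/ollama-bin nvidia systemd -amd

  $ emerge --ask app-misc/ollama-bin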
These are the packages that would be merged, in order:
Calculating dependencies
* IMPORTANT: 1 news items need reading for repository 'gentoo'.
* Use eselect news read to view new items.
... done!
Dependency resolution took 1.80 s (backtrack: 0/20).
[ebuild N ] acct-group/ollama-0-r1::strix-overlay 0 KiB
[ebuild N ] acct-user/ollama-0-r1::strix-overlay 0 KiB
[ebuild N ~] app-misc/ollama-bin-0.5.4-r2::strix-overlay USE="nvidia systemd -amd" 0 KiB
Total: 3 packages (3 new), Size of downloads: 0 KiB
>>> Verifying ebuild manifests
>>> Running pre-merge checks for acct-group/ollama-0-r1
>>> Running pre-merge checks for acct-user/ollama-0-r1
>>> Running pre-merge checks for app-misc/ollama-bin-0.5.4-r2
>>> Emerging (1 of 3) acct-group/ollama-0-r1::strix-overlay
>>> Unpacking source...
>>> Source unpacked in /var/tmp/portage/acct-group/ollama-0-r1/work
>>> Preparing source in /var/tmp/portage/acct-group/ollama-0-r1/work ...
>>> Source prepared.
>>> Configuring source in /var/tmp/portage/acct-group/ollama-0-r1/work ...
>>> Source configured.
>>> Compiling source in /var/tmp/portage/acct-group/ollama-0-r1/work ...
>>> Source compiled.
>>> Test phase [not enabled]: acct-group/ollama-0-r1
>>> Install acct-group/ollama-0-r1 into /var/tmp/portage/acct-group/ollama-0-r1/image
>>> Completed installing acct-group/ollama-0-r1 into /var/tmp/portage/acct-group/ollama-0-r1/image
* Final size of build directory: 0 KiB
* Final size of installed tree: 4 KiB
>>> Installing (1 of 3) acct-group/ollama-0-r1::strix-overlay
* checking 1 files for package collisions
>>> Merging acct-group/ollama-0-r1 to /
* Group ollama already exists
--- /usr/
--- /usr/lib/
--- /usr/lib/sysusers.d/
>>> /usr/lib/sysusers.d/acct-group-ollama.conf
>>> acct-group/ollama-0-r1 merged.
>>> Completed (1 of 3) acct-group/ollama-0-r1::strix-overlay
>>> Emerging (2 of 3) acct-user/ollama-0-r1::strix-overlay
>>> Unpacking source...
>>> Source unpacked in /var/tmp/portage/acct-user/ollama-0-r1/work
>>> Preparing source in /var/tmp/portage/acct-user/ollama-0-r1/work ...
>>> Source prepared.
>>> Configuring source in /var/tmp/portage/acct-user/ollama-0-r1/work ...
>>> Source configured.
>>> Compiling source in /var/tmp/portage/acct-user/ollama-0-r1/work ...
>>> Source compiled.
>>> Test phase [not enabled]: acct-user/ollama-0-r1
>>> Install acct-user/ollama-0-r1 into /var/tmp/portage/acct-user/ollama-0-r1/image
>>> Completed installing acct-user/ollama-0-r1 into /var/tmp/portage/acct-user/ollama-0-r1/image
* Final size of build directory: 0 KiB
* Final size of installed tree: 4 KiB
>>> Installing (2 of 3) acct-user/ollama-0-r1::strix-overlay
* checking 2 files for package collisions
>>> Merging acct-user/ollama-0-r1 to /
* User ollama already exists
--- /usr/
--- /usr/lib/
--- /usr/lib/sysusers.d/
>>> /usr/lib/sysusers.d/acct-user-ollama.conf
--- /opt/
--- /opt/ollama-bin/
>>> /opt/ollama-bin/.keep_acct-user_ollama-0
* Updating user ollama
>>> acct-user/ollama-0-r1 merged.
>>> Completed (2 of 3) acct-user/ollama-0-r1::strix-overlay
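The two acct-* packages only install sysusers.d entries; since the ollama group and user already exist on this box, the merge just refreshes them. For illustration, such entries typically look like the sketch below (the numeric IDs are assigned by the Gentoo acct-* eclasses and are not visible in this log, so they are left as placeholders), and the result can be checked with getent:

  /usr/lib/sysusers.d/acct-group-ollama.conf (sketch):
    g ollama <gid>

  /usr/lib/sysusers.d/acct-user-ollama.conf (sketch):
    u ollama <uid>:<gid> "ollama" /opt/ollama-bin

  $ getent group ollama
  $ getent passwd ollama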
>>> Emerging (3 of 3) app-misc/ollama-bin-0.5.4-r2::strix-overlay
* ollama-linux-amd64.tgz BLAKE2B SHA512 size ;-) ... [ ok ]
* Checking for at least 4 GiB disk space at "/var/tmp/notmpfs/portage/app-misc/ollama-bin-0.5.4-r2/temp" ... [ ok ]
>>> Unpacking source...
>>> Unpacking ollama-linux-amd64.tgz to /var/tmp/notmpfs/portage/app-misc/ollama-bin-0.5.4-r2/work
>>> Source unpacked in /var/tmp/notmpfs/portage/app-misc/ollama-bin-0.5.4-r2/work
>>> Preparing source in /var/tmp/notmpfs/portage/app-misc/ollama-bin-0.5.4-r2/work ...
>>> Source prepared.
>>> Configuring source in /var/tmp/notmpfs/portage/app-misc/ollama-bin-0.5.4-r2/work ...
>>> Source configured.
>>> Compiling source in /var/tmp/notmpfs/portage/app-misc/ollama-bin-0.5.4-r2/work ...
>>> Source compiled.
>>> Test phase [not enabled]: app-misc/ollama-bin-0.5.4-r2
>>> Install app-misc/ollama-bin-0.5.4-r2 into /var/tmp/notmpfs/portage/app-misc/ollama-bin-0.5.4-r2/image
*
* INFO: Models and checksums saved into /opt/ollama-bin/.ollama are preserved...
*
>>> Completed installing app-misc/ollama-bin-0.5.4-r2 into /var/tmp/notmpfs/portage/app-misc/ollama-bin-0.5.4-r2/image
* Final size of build directory: 3595784 KiB (3.4 GiB)
* Final size of installed tree: 3595816 KiB (3.4 GiB)
strip: x86_64-pc-linux-gnu-strip --strip-unneeded -N __gentoo_check_ldflags__ -R .comment -R .GCC.command.line -R .note.gnu.gold-version
/opt/ollama-bin/lib/ollama/libcudart.so.11.3.109
/opt/ollama-bin/lib/ollama/libcudart.so.12.4.127
/opt/ollama-bin/lib/ollama/libcublasLt.so.11.5.1.109
/opt/ollama-bin/lib/ollama/libcublas.so.12.4.5.8
/opt/ollama-bin/lib/ollama/runners/cuda_v11_avx/ollama_llama_server
/opt/ollama-bin/lib/ollama/libcublas.so.11.5.1.109
/opt/ollama-bin/lib/ollama/runners/cpu_avx/ollama_llama_server
/opt/ollama-bin/lib/ollama/runners/cuda_v12_avx/ollama_llama_server
/opt/ollama-bin/lib/ollama/libcublasLt.so.12.4.5.8
/opt/ollama-bin/bin/ollama
/opt/ollama-bin/lib/ollama/runners/cuda_v11_avx/libggml_cuda_v11.so
/opt/ollama-bin/lib/ollama/runners/rocm_avx/ollama_llama_server
/opt/ollama-bin/lib/ollama/runners/cpu_avx2/ollama_llama_server
/opt/ollama-bin/lib/ollama/runners/rocm_avx/libggml_rocm.so
/opt/ollama-bin/lib/ollama/runners/cuda_v12_avx/libggml_cuda_v12.so
 * QA Notice: Unresolved soname dependencies:
 * 
 *  /opt/ollama-bin/lib/ollama/runners/rocm_avx/libggml_rocm.so: libamdhip64.so.6 libhipblas.so.2 librocblas.so.4
 *  /opt/ollama-bin/lib/ollama/runners/rocm_avx/ollama_llama_server: libamdhip64.so.6 libhipblas.so.2 librocblas.so.4
 * 
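The unresolved sonames above belong to the bundled ROCm runner. This build used USE="-amd", so no ROCm libraries were pulled in as dependencies, which presumably explains the warning; the CPU and CUDA runners are unaffected. To see exactly which libraries the ROCm runner would still need, plain ldd works (not part of the ebuild, just a manual check):

  $ ldd /opt/ollama-bin/lib/ollama/runners/rocm_avx/libggml_rocm.so | grep 'not found'
  $ ldd /opt/ollama-bin/lib/ollama/runners/rocm_avx/ollama_llama_server | grep 'not found'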
>>> Installing (3 of 3) app-misc/ollama-bin-0.5.4-r2::strix-overlay
* checking 23 files for package collisions
>>> Merging app-misc/ollama-bin-0.5.4-r2 to /
--- /usr/
--- /usr/lib/
--- /usr/lib/systemd/
--- /usr/lib/systemd/system/
>>> /usr/lib/systemd/system/ollama.service
--- /usr/bin/
--- /var/
--- /var/log/
--- /var/log/ollama/
>>> /var/log/ollama/.keep_app-misc_ollama-bin-0
--- /opt/
--- /opt/ollama-bin/
>>> /opt/ollama-bin/lib/
>>> /opt/ollama-bin/lib/ollama/
>>> /opt/ollama-bin/lib/ollama/libcublas.so.11.5.1.109
>>> /opt/ollama-bin/lib/ollama/runners/
>>> /opt/ollama-bin/lib/ollama/runners/cuda_v12_avx/
>>> /opt/ollama-bin/lib/ollama/runners/cuda_v12_avx/libggml_cuda_v12.so
>>> /opt/ollama-bin/lib/ollama/runners/cuda_v12_avx/ollama_llama_server
>>> /opt/ollama-bin/lib/ollama/runners/cpu_avx/
>>> /opt/ollama-bin/lib/ollama/runners/cpu_avx/ollama_llama_server
>>> /opt/ollama-bin/lib/ollama/runners/cuda_v11_avx/
>>> /opt/ollama-bin/lib/ollama/runners/cuda_v11_avx/libggml_cuda_v11.so
>>> /opt/ollama-bin/lib/ollama/runners/cuda_v11_avx/ollama_llama_server
>>> /opt/ollama-bin/lib/ollama/runners/cpu_avx2/
>>> /opt/ollama-bin/lib/ollama/runners/cpu_avx2/ollama_llama_server
>>> /opt/ollama-bin/lib/ollama/runners/rocm_avx/
>>> /opt/ollama-bin/lib/ollama/runners/rocm_avx/libggml_rocm.so
>>> /opt/ollama-bin/lib/ollama/runners/rocm_avx/ollama_llama_server
>>> /opt/ollama-bin/lib/ollama/libcublasLt.so.12.4.5.8
>>> /opt/ollama-bin/lib/ollama/libcublasLt.so.11.5.1.109
>>> /opt/ollama-bin/lib/ollama/libcublasLt.so.12 -> ./libcublasLt.so.12.4.5.8
>>> /opt/ollama-bin/lib/ollama/libcublas.so.12.4.5.8
>>> /opt/ollama-bin/lib/ollama/libcudart.so.11.3.109
>>> /opt/ollama-bin/lib/ollama/libcudart.so.12.4.127
>>> /opt/ollama-bin/lib/ollama/libcublas.so.11 -> libcublas.so.11.5.1.109
>>> /opt/ollama-bin/bin/
>>> /opt/ollama-bin/bin/ollama
>>> /opt/ollama-bin/lib/ollama/libcudart.so.12 -> libcudart.so.12.4.127
>>> /opt/ollama-bin/lib/ollama/libcudart.so.11.0 -> libcudart.so.11.3.109
>>> /opt/ollama-bin/lib/ollama/libcublas.so.12 -> ./libcublas.so.12.4.5.8
>>> /opt/ollama-bin/lib/ollama/libcublasLt.so.11 -> libcublasLt.so.11.5.1.109
>>> /usr/bin/ollama -> ../../opt/ollama-bin/bin/ollama
*
* Quick guide:
*
* $ ollama serve (standalone, systemd, openrc)
* $ ollama run llama3:70b (client)
*
* Browse available models at: https://ollama.com/library/
*
>>> app-misc/ollama-bin-0.5.4-r2 merged.
>>> Recording app-misc/ollama-bin in "world" favorites file...
>>> Completed (3 of 3) app-misc/ollama-bin-0.5.4-r2::strix-overlay
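Since the build used USE="systemd" and installed /usr/lib/systemd/system/ollama.service, the server can be managed with the usual systemd commands (run as root or via sudo; illustrative usage, not taken from this log):

  $ systemctl enable --now ollama.service
  $ systemctl status ollama.service
  $ journalctl -u ollama.service -f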
* Messages for package acct-group/ollama-0-r1:
* Group ollama already exists
* Messages for package acct-user/ollama-0-r1:
* User ollama already exists
* Updating user ollama
>>> Auto-cleaning packages...
>>> No outdated packages were found on your system.
* GNU info directory index is up-to-date.
* IMPORTANT: 1 news items need reading for repository 'gentoo'.
* Use eselect news read to view new items.
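To wrap up, the pending news item can be read with eselect, and the fresh install can be exercised from a client shell once the server is running. The /api/tags call assumes Ollama's default listen address of localhost:11434, which is not shown in this log:

  $ eselect news read
  $ ollama --version
  $ curl http://localhost:11434/api/tags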