$ LD_LIBRARY_PATH=bin bin/llama-cli -m ~/llm/gpt-oss-120b-F16.gguf --jinja -no-cnv -p "Какой командой процессора колобок есть чёрта в компьютерной игре Pacman?"
build: 6730 (e60f01d94) with cc (Gentoo 13.2.1_p20240210 p14) 13.2.1 20240210 for x86_64-pc-linux-gnu
main: llama backend init
main: load the model and apply lora adapter, if any
llama_model_loader: loaded meta data with 37 key-value pairs and 687 tensors from /home/sysop/llm/gpt-oss-120b-F16.gguf (version GGUF V3 (latest))
llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
llama_model_loader: - kv   0: general.architec
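For reference, the invocation above follows this general pattern; the flag notes are assumptions based on common llama-cli usage, and the model path and prompt are placeholders:

    # run from the llama.cpp build directory so LD_LIBRARY_PATH=bin resolves the locally built shared libraries
    LD_LIBRARY_PATH=bin bin/llama-cli \
        -m /path/to/model.gguf \   # GGUF model file to load
        --jinja \                  # apply the chat template embedded in the GGUF via Jinja
        -no-cnv \                  # one-shot completion instead of interactive conversation mode
        -p "your prompt here"      # prompt text passed to the model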
# This is the CMakeCache file.
# For build in directory: /home/sysop/llm/llama.cpp/build
# It was generated by CMake: /usr/bin/cmake
# You can edit this file to change values found and used by cmake.
# If you do not want to change any of the values, simply exit the editor.
# If you do want to change a value, simply edit, save, and exit the editor.
# The syntax for the file is as follows:
# KEY:TYPE=VALUE
# KEY is the name of a variable in the cache.
# TYPE is a hint to GUIs for the type of VALUE, DO NOT EDIT TYPE!.
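To illustrate the KEY:TYPE=VALUE syntax described above, a few representative cache entries follow; the variable names are standard CMake / llama.cpp options, but the values shown here are illustrative and not taken from this particular build's cache:

    CMAKE_BUILD_TYPE:STRING=Release
    CMAKE_INSTALL_PREFIX:PATH=/usr/local
    GGML_CUDA:BOOL=OFF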