# Clone llama.cpp
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
# Build it (LLAMA_METAL=1 enables Apple Metal GPU acceleration)
LLAMA_METAL=1 make
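# Optional sanity check (a minimal sketch, assuming the build above succeeded and
# left ./main in the repo root): printing the usage text confirms the binary runs.
# A plain `make` without LLAMA_METAL=1 would produce a CPU-only build instead.
./main --help | head -n 20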
# Download model
export MODEL=llama-2-13b-chat.ggmlv3.q4_0.bin
wget "https://huggingface.co/TheBloke/Llama-2-13B-chat-GGML/resolve/main/${MODEL}"
# Run
# Prompt for input, then generate a chat-style completion:
#   -t 8                  use 8 CPU threads
#   -ngl 1                offload computation to the GPU (with Metal, any value above 0 enables offload)
#   -c 2048               context size in tokens
#   --temp 0.7            sampling temperature
#   --repeat_penalty 1.1  penalize repeated tokens
#   -n -1                 no fixed limit on generated tokens
#   -p "[INST] ... [/INST]"  wrap the prompt in Llama 2's chat instruction format
echo "Prompt: " \
&& read -r PROMPT \
&& ./main \
-t 8 \
-ngl 1 \
-m "${MODEL}" \
--color \
-c 2048 \
--temp 0.7 \
--repeat_penalty 1.1 \
-n -1 \
-p "[INST] ${PROMPT} [/INST]"