@codeprimate
Last active January 29, 2024 20:34
#!/bin/bash
# Usage:
# ~/bin/ai <<< "Why is the sky blue?"
# ls -l | ~/bin/ai "Which files listed above have symlinks?"
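# The PROMPT_FILE environment variable (handled below) overrides the default base prompt file.
# A hedged example, with ~/prompts/reviewer.txt standing in for any prompt file of your own:
# PROMPT_FILE=~/prompts/reviewer.txt git diff | ~/bin/ai "Review this diff"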
context_size=4000
n_predict=2000
temperature=0.5
prompt_prefix="\\nUSER: "
prompt_suffix="\\nAI: "
llama_path=~/services/llama.cpp
prompt_file=$llama_path/default_prompt.txt
model=~/services/textgen/models/nous-hermes-2-solar-10.7b.Q8_0.4k.gguf
# Load the base prompt; the PROMPT_FILE environment variable overrides the default file
if [ -n "$PROMPT_FILE" ]; then
  prompt=$(cat "$PROMPT_FILE")
else
  prompt=$(cat "$prompt_file")
fi
prompt+="\\n$prompt_prefix"
while IFS= read -r line; do
prompt+="$line\n"
done
prompt+=$1
prompt+="\\n"
prompt+=$prompt_suffix
prompt+="\\n\\n"
prompt=${prompt%\\n}
# Run llama.cpp and strip everything up to and including the "AI: " marker from the output
cd "$llama_path" || exit 1
./main --temp "$temperature" --n-predict "$n_predict" -e -c "$context_size" -m "$model" -p "$prompt" --prompt-cache-all 2>/dev/null | \
  sed "1,/AI: /d"