@ncrmro
Created June 24, 2024 13:18
A minimal implementation of YAMs (declarative LLM chats)
#!/bin/bash
set -e
# Assign the first argument to the yaml_file variable
yaml_file="$1"
model="mistral"
ollama_endpoint="http://localhost:11434/api/chat"
# Check if the YAML file path is provided
if [ -z "$yaml_file" ]; then
  echo "Usage: $0 <path_to_yaml_file>"
  exit 1
fi
# Check if the file exists
if [ ! -f "$yaml_file" ]; then
  echo "File not found: $yaml_file"
  exit 1
fi
# Note some snap versions of yq can't read from the tmp directory.
temp_dir=$(mktemp --directory)
request=${temp_dir}/request.json
response=${temp_dir}/res.json
# Build the request: merge the model and stream settings with the YAML chat, emitting JSON.
yq eval "{\"model\": \"${model}\", \"stream\": false} + ." "${yaml_file}" -o json > "${request}"
# Send the request to Ollama and save the response.
curl -X POST -H "Content-Type: application/json" -d "@${request}" "${ollama_endpoint}" -o "${response}"
cat "${response}"
# Extract the assistant's message and append it to the conversation.
response_message=$(yq eval --input-format=json --output-format=json ".message" "${response}")
yq eval --input-format=json --output-format=yaml -i ".messages += [${response_message}]" "${request}"
# Open the updated conversation (macOS `open`; use xdg-open on Linux).
open "${request}"
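The script expects a YAML file whose top-level `messages` list follows Ollama's chat format (each entry has a `role` and `content`). A minimal sketch of such a file, assuming a hypothetical filename `chat.yaml`:

```shell
# Create a minimal declarative chat file (the filename chat.yaml is illustrative).
cat > chat.yaml <<'EOF'
messages:
  - role: system
    content: You are a concise assistant.
  - role: user
    content: What is the capital of France?
EOF
# Then run the script against it (assumes Ollama is serving the mistral model locally):
# ./yams.sh chat.yaml
```

After a run, the temporary request file holds the full conversation including the model's reply, so it can be edited and re-submitted to continue the chat declaratively.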