LLM Helper scripts to use from the CLI
#!/bin/bash
# llm-pre: prepend a prompt string (the first argument) to whatever arrives on stdin

# Check if an argument is provided
if [ $# -eq 0 ]; then
  echo "Usage: llm-pre 'string to prepend'"
  exit 1
fi

PREFIX="$1"

# Read from stdin
INPUT=$(cat)

# Output the combined result
echo -e "$PREFIX\n$INPUT"
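
For instance, llm-pre can turn any command's help output into a prompt. A minimal usage sketch (the command and prompt text here are only illustrative):

# Prepend a question to curl's help output and send it to the default model
curl --help | llm-pre 'Which of these flags control timeouts? Here is the help text:
' | llm
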
#!/bin/bash
# Read an LLM-generated Python script from stdin, then let the user review, explain, or run it

# Read the Python script from stdin
SCRIPT=$(cat)

# Function to explain Python code using llm
explain_code() {
  local code="$1"

  echo ""
  echo "Explaining Python code using LLM..."
  echo "-----------------------------------"
  llm -t code-expert "$code"
  echo "-----------------------------------"
}

# Function to execute Python code with live output
execute_code() {
  local code="$1"

  echo ""
  echo "Executing Python code..."
  echo "-----------------------------------"

  # Create a temporary file for the Python code
  local tempfile=$(mktemp /tmp/python-script.XXXXXX.py)
  echo "$code" > "$tempfile"

  # Execute the Python script
  python3 "$tempfile" 2>&1
  local exit_code=$?

  # Clean up
  rm "$tempfile"

  echo "-----------------------------------"
  echo "Code completed with exit code: $exit_code"

  return $exit_code
}

# Main loop for handling user interaction
while true; do
  # Print the script
  echo ""
  echo "Generated Python Script:"
  echo "-----------------------------------"
  echo "$SCRIPT" | bat --style=plain --paging=never -l python
  echo "-----------------------------------"

  # Ask for confirmation (read from /dev/tty)
  echo "Options:"
  echo " y - execute the Python code"
  echo " e - explain the code"
  echo " n - cancel execution"
  echo ""
  read -p "What would you like to do? [y/e/N]: " -r CONFIRM </dev/tty

  case $CONFIRM in
    [Yy])
      execute_code "$SCRIPT"
      exit $?
      ;;
    [Ee])
      explain_code "$SCRIPT"
      # Continue the loop to ask again after explanation
      ;;
    *)
      echo "Execution cancelled."
      exit 1
      ;;
  esac
done
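
Assuming the Python reviewer above is saved as llm-py on PATH (the gist does not name the file, so that name is an assumption), and the Python template at the end of this gist is saved as python-code (also an assumed name), a typical invocation looks like:

# Generate a script with the Python template, then review it before running
llm -t python-code 'Print the 10 largest files under the current directory' | llm-py
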
#!/bin/bash
# llm-sh: read an LLM-generated shell command from stdin, then let the user review, explain, or run it

# Read the script from stdin
SCRIPT=$(cat)

# Function to explain command using llm
explain_command() {
  local cmd="$1"

  echo ""
  echo "Explaining command using LLM..."
  echo "-----------------------------------"
  llm -t code-expert "$cmd"
  echo "-----------------------------------"
}

# Function to execute command with live output
execute_command() {
  local cmd="$1"

  echo ""
  echo "Executing command..."
  echo "-----------------------------------"

  # Use eval so quoting and shell features in the generated command are interpreted as written
  eval "$cmd" 2>&1
  local exit_code=$?

  echo "-----------------------------------"
  echo "Command completed with exit code: $exit_code"

  return $exit_code
}

# Main loop for handling user interaction
while true; do
  # Print the script
  echo ""
  echo "Generated Script:"
  echo "-----------------------------------"
  echo "$SCRIPT"
  echo "-----------------------------------"

  # Ask for confirmation (read from /dev/tty)
  echo "Options:"
  echo " y - execute the script"
  echo " e - explain the command"
  echo " n - cancel execution"
  echo ""
  read -p "What would you like to do? [y/e/N]: " -r CONFIRM </dev/tty

  case $CONFIRM in
    [Yy])
      execute_command "$SCRIPT"
      exit $?
      ;;
    [Ee])
      explain_command "$SCRIPT"
      # Continue the loop to ask again after explanation
      ;;
    *)
      echo "Execution cancelled."
      exit 1
      ;;
  esac
done
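
The shell variant works the same way. Besides the pkgx pipeline in the Examples section below, a simpler invocation (prompt text illustrative) is:

# Ask for a one-liner with the sh-code template, then choose y/e/n at the review prompt
llm -t sh-code 'Find files larger than 100MB under ~/Downloads' | llm-sh
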

Background

I use the llm Python package from the CLI to interact with LLMs quickly and stay productive in the terminal. My setup involves two things:

  1. llm templates: llm allows defining templates where the model, system prompt, and other parameters are predefined.
  2. llm scripts: I write various shell scripts that complement the pipeline in different ways, as seen above; a minimal setup sketch follows this list.
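
The scripts just need to be executable and on PATH. A setup sketch (the install directory is an example, and llm-py is an assumed name for the Python reviewer, which the gist does not name):

# Install the helper scripts; only llm-pre and llm-sh are named in the gist, llm-py is assumed
mkdir -p ~/.local/bin
cp llm-pre llm-py llm-sh ~/.local/bin/
chmod +x ~/.local/bin/llm-pre ~/.local/bin/llm-py ~/.local/bin/llm-sh
export PATH="$HOME/.local/bin:$PATH"   # persist this in your shell profile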

Examples

Help with a command

  • Output the help of a command
  • Prepend the help message with a prompt
  • Pass the combined prompt to llm
  • Pipe the llm output into llm-sh to review the generated command and decide what to do
# NOTE: sh-code is an llm template whose system prompt forces shell-command-only output
pkgx --help --verbose | llm-pre 'Help me uninstall httpie from pkgx. Here is the command docs:
' | llm -t sh-code | llm-sh 
model: 4o
system: You are a Software Engineer with in-depth knowledge in writing Python scripts.
  Given the user requests, always respond in correct Python code only without
  explanation or code fences such that it can be executed as is.

model: claude-3.5-sonnet
system: You are a Software Engineer with in-depth knowledge in writing bash commands.
  Given the user requests, always respond in a one-liner bash command only without
  explanation or code fences such that it can be executed as is. Make sure the
  commands are compatible with macOS specifically.
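
These YAML files become usable templates once they are placed in llm's templates directory, where the file name (minus .yaml) is the template name. The second one matches the sh-code name used in the example above; python-code is an assumed name for the first, and the code-expert template the scripts call for explanations is not included in the gist. A sketch of registering them, assuming that directory layout:

# llm resolves -t NAME to NAME.yaml inside its templates directory
TEMPLATES_DIR="$(llm templates path)"
cp python-code.yaml sh-code.yaml "$TEMPLATES_DIR/"
llm templates list   # confirm the templates show up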