Please use https://github.com/Aaronik/ai-functions instead.
Dead simple zsh functions that integrate OpenAI LLMs into your command line.
| Command | Description |
|---|---|
| `ai` | Create bash one-liners. You tell it what you want the bash command to do, and it puts it right onto the command line for you. |
| `ai_models` | Enumerate the models your `OPENAI_API_KEY` has access to. It simply lists all the OpenAI models you currently have access to, easy as pie. |
| Example Command | Description |
|---|---|
| `ai list all open ports` | Generates a bash one-liner to list all open ports. |
| `ai remove all html from html_file.html` | Generates a bash one-liner to remove all HTML from a specified file. |
| `ai show me the weather in my local region` | Generates a bash one-liner to show the weather in your local region. |
| `ai watch star wars in the terminal` | Generates a bash one-liner to watch Star Wars in the terminal. |
You can also pipe into `ai` to give it additional context.
| Example Command | Description |
|---|---|
| `lsusb \| ai disconnect from all bluetooth devices` | Pipes the output of `lsusb` into `ai` to generate a command to disconnect from all Bluetooth devices. |
| `ifconfig \| ai port knock my local machine` | Pipes the output of `ifconfig` into `ai` to generate a command to perform a port knock on your local machine. |
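Under the hood, `ai` decides whether context was piped in by testing whether stdin is a terminal. A minimal standalone sketch of that pattern (the function name `context_demo` is just for illustration):

```shell
# [ -t 0 ] is true when stdin is attached to a terminal,
# and false when data is being piped in.
context_demo() {
  local piped=""
  if ! [ -t 0 ]; then
    piped=$(cat -)   # slurp whatever was piped in
  fi
  if [ -n "$piped" ]; then
    echo "got context: $piped"
  else
    echo "no context"
  fi
}

echo "hello" | context_demo   # prints: got context: hello
```

When nothing is piped in (`context_demo </dev/null`), the function reads nothing and reports `no context`, which is why the prompt in `ai` only grows when real context is supplied.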
Copy and paste the following functions into your ~/.zshrc file (or, of course, whatever convoluted dotfile management system you've invented for yourself).
```zsh
function ai() {
  local system_content="uname -o: $(uname -o), uname -r: $(uname -r)."
  local text="Write a bash command to $*. Return only the command, no other text. Do not describe what it is. I want to copy/paste exactly what you return and run it directly in a terminal. Do not include quotes or back ticks in the answer."
  local piped=""

  # If data is being piped in, read it and append it as extra context
  if ! [ -t 0 ]; then
    piped=$(cat -)
  fi
  if [ -n "$piped" ] && [ "$piped" != '""' ]; then
    text="$text Here is further context for this request: $piped"
  fi

  # Construct the JSON payload (jq --arg handles escaping of the text)
  local json_payload=$(jq -n --arg system_content "$system_content" --arg text "$text" '{
    "messages": [
      {"role": "system", "content": $system_content},
      {"role": "user", "content": $text}
    ],
    "max_tokens": 303,
    "temperature": 0,
    "model": "gpt-4-1106-preview"
  }')

  local response=$(curl -s -X POST \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer $OPENAI_API_KEY" \
    --data "$json_payload" \
    https://api.openai.com/v1/chat/completions
  )

  local completion=$(echo "$response" | jq -r '.choices[0].message.content')

  # Guard for an OpenAI error response: print it and return non-zero
  if [[ $completion == "null" || $completion == "" ]]; then
    echo "$response"
    false
    return
  fi

  # Push the generated command onto the zsh line editor buffer
  print -z "$completion"
}

function ai_models() {
  curl -s --header "Authorization: Bearer $OPENAI_API_KEY" https://api.openai.com/v1/models \
    | jq -r '.data[].id'
}
```
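The `jq -n --arg` pattern above is what makes the payload safe to build from arbitrary user text: jq escapes quotes, newlines, and backslashes, so the prompt can never break the JSON. A standalone sketch of just that piece (the `prompt` value is an arbitrary example):

```shell
# Build a minimal chat-completions payload from untrusted text.
# jq --arg escapes the value, so quotes in $prompt can't break the JSON.
prompt='list files named "*.log"'
payload=$(jq -n --arg text "$prompt" '{
  "messages": [{"role": "user", "content": $text}]
}')

# Round-trip it back out to confirm the text survived intact
echo "$payload" | jq -r '.messages[0].content'   # prints: list files named "*.log"
```

Building the payload with string interpolation instead would break the first time a prompt contained a double quote; delegating to jq sidesteps that entirely.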
You'll need to have the `OPENAI_API_KEY` environment variable visible from wherever these functions are called. As it stands, you'll also need access to the gpt-4 based models, which you can get by prepaying for the OpenAI API at https://platform.openai.com/account/billing/overview.
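One way to do that is to export the key in the same `~/.zshrc` (the value below is a placeholder, not a real key):

```shell
# ~/.zshrc — make the key visible to ai() and ai_models()
export OPENAI_API_KEY="sk-your-key-here"
```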
You can see a video demo of the `ai()` function here: https://youtu.be/a_5-7qCuzpw