@simonw (Author) commented Mar 4, 2024

[Screenshot: CleanShot 2024-03-04 at 09 28 08@2x]

@eliyastein

Hey @simonw - You don't need the "Get Specified Text" action. My setup looks like this:

[Screenshot 2024-03-04 at 12 35 17 PM]

You might need the full path to the llm binary. Here's my full script:

escaped_args=""
for arg in "$@"; do
  # Escape single quotes (replace ' with '\'')
  escaped_arg=$(printf '%s\n' "$arg" | sed "s/'/'\\\\''/g")
  # Append the escaped argument to the string, surrounded by single quotes
  escaped_args="$escaped_args '$escaped_arg'"
done

# Use the escaped arguments in your command
result=$(/Users/eliya/Library/Python/3.9/bin/llm -m gpt-4 $escaped_args)

# Handle the result as before
escapedResult=$(echo "$result" | sed 's/\\/\\\\/g' | sed 's/"/\\"/g' | awk '{printf "%s\\n", $0}' ORS='')
osascript -e "display dialog \"$escapedResult\""

At that point, once you save, the action should be available under the Services menu:

[Screenshot 2024-03-04 at 12 41 40 PM]

You can then create a hotkey for it under Keyboard Shortcuts -> Services for ease of use:

[Screenshot 2024-03-04 at 12 42 41 PM]

@jziggas commented Mar 4, 2024

Hi folks! I love this idea as well but can't seem to get it to work. I'm not very familiar with Automator.

I get a blank dialog when I try to run the Quick Action:
[Screenshot: blank dialog]

This is what the Quick Action definition looks like:

escaped_args=""
for arg in "$@"; do
  # Escape single quotes (replace ' with '\'')
  escaped_arg=$(printf '%s\n' "$arg" | sed "s/'/'\\\\''/g")
  # Append the escaped argument to the string, surrounded by single quotes
  escaped_args="$escaped_args '$escaped_arg'"
done

# Use the escaped arguments in your command
result=$(/Users/joshuaziggas/Library/Python/3.9/bin/llm -m gpt-4 $escaped_args)

# Handle the result as before
escapedResult=$(echo "$result" | sed 's/\\/\\\\/g' | sed 's/"/\\"/g' | awk '{printf "%s\\n", $0}' ORS='')
osascript -e "display dialog \"$escapedResult\""
[Screenshot: Quick Action definition in Automator]

I ran brew install llm and llm keys set openai and then confirmed my key works with llm "Ten fun names for a pet pelican"
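For reference, here are those setup and verification steps as a plain shell sketch (just the commands described above):

# Install the llm CLI via Homebrew
brew install llm
# Store the OpenAI API key (the command prompts for the key value)
llm keys set openai
# Quick sanity check that the key works
llm "Ten fun names for a pet pelican"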

[Screenshot: terminal output]

Any ideas?

(Apple M1 Max, macOS Ventura 13.5, build 22G74)

@maschinenzeitmaschine

Um… if I have llm installed in a conda environment, how would I need to adapt this?
@simonw, maybe consider supporting this 'officially' (i.e. make it easier to set up)? I am aware this gets annoying with different setups (like mine), but this really sounds insanely useful! (By the way: huge fan of your blog!)

@mbafford commented Mar 4, 2024

@jziggas I experienced the exact same output when I blindly copied the code with ~eliya in the path. Correcting that to the right path fixed the issue for me. Are you sure /Users/joshuaziggas/Library/Python/3.9/bin/llm is the correct path for your llm binary?

@eliyastein commented Mar 4, 2024

@jziggas - running the Quick Action directly won't work, because no argument/context gets passed along as the prompt. I would save the Automator action and try invoking it from the Services context menu in some other app, after highlighting some text. Does that work?

Also, to @mbafford's point, make sure the path to the binary is right. You can find it by typing which llm into the terminal.
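For example, a minimal sketch (the printed path varies by machine and install method):

# Print the absolute path of the llm binary currently on your PATH
which llm
# On a Homebrew install this is typically something like /opt/homebrew/bin/llm;
# paste whatever it prints into the Automator script in place of the hardcoded path.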

@eliyastein

@maschinenzeitmaschine have you tried hardcoding the path to the llm binary in the shell script? I'm not too familiar with Conda, but that approach works with something like a Python venv, and I imagine Conda would be similar.
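A rough sketch of what I mean, assuming a hypothetical venv at ~/venvs/llm (substitute whatever which llm prints from inside your activated environment):

# Hypothetical environment location - adjust to wherever yours actually lives
LLM_COMMAND="$HOME/venvs/llm/bin/llm"

# Call the hardcoded binary instead of relying on PATH
result=$("$LLM_COMMAND" -m gpt-4 $escaped_args)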

@jziggas commented Mar 4, 2024

@mbafford Ugh, good catch - since I installed via Homebrew, the path was /opt/homebrew/bin/llm. Thanks :)

@maschinenzeitmaschine commented Mar 4, 2024

@eliyastein thanks, that did the trick!

For the record: I got it running with llm in a conda environment; below is my script (conda base env, miniconda installed through brew).
Note: this does NOT work if llm is set to use a custom directory location - use the default one!

LLM_COMMAND="/System/Volumes/Data/opt/homebrew/Caskroom/miniconda/base/bin/llm"

escaped_args=""
for arg in "$@"; do
  # Escape single quotes (replace ' with '\'')
  escaped_arg=$(printf '%s\n' "$arg" | sed "s/'/'\\\\''/g")
  # Append the escaped argument to the string, surrounded by single quotes
  escaped_args="$escaped_args '$escaped_arg'"
done

# Use the escaped arguments in your command
result=$($LLM_COMMAND -m claude-3-opus $escaped_args)

# Handle the result as before
escapedResult=$(echo "$result" | sed 's/\\/\\\\/g' | sed 's/"/\\"/g' | awk '{printf "%s\\n", $0}' ORS='')
osascript -e "display dialog \"$escapedResult\""
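If you need the equivalent path for your own conda setup, one way to find it (assuming conda is on your PATH) is to activate the environment that has llm installed and ask where the binary lives:

# Activate the environment containing llm, then print the binary's location
conda activate base
which llm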

@maschinenzeitmaschine commented Mar 4, 2024

P.S. I'm well aware this is out of scope and needs quite a bit more than a shell script, but wouldn't it be insanely cool if this opened not just a text window with an answer, but an interactive shell? Like a chat interface, so I could ask follow-up questions? I really love the idea of having an LLM available anywhere in the Finder, without having to open an app or browser or terminal and copy/paste every time…

@eliyastein

@maschinenzeitmaschine - the script can be modified to pop open a terminal that starts a chat with the highlighted text as the system prompt. Something like this:

escaped_args=""
for arg in "$@"; do
  escaped_arg=$(printf '%s\n' "$arg" | sed "s/'/'\\\\''/g")
  escaped_args="$escaped_args '$escaped_arg'"
done

osascript -e "tell application \"Terminal\" to do script \"/usr/local/bin/llm chat -m gpt-4 -s ${escaped_args}\""

@devtanna

While this is great, I wish I could use it in any app. Currently I cannot access this Automator workflow from within Slack :/

@eliyastein

@devtanna - It works from Slack for me. If you highlight some text in Slack, you can access the workflow from the menu Slack -> Services -> LLM, or you can invoke it with a hotkey if you've set one under Keyboard Shortcuts.

@devtanna

Oh cool, thanks @eliyastein! Works now :)
