@samuelcolvin
Last active March 2, 2024 16:04
OpenAI powered AI CLI in just a few lines of code - moved to https://github.com/samuelcolvin/aicli
#!/usr/bin/env python3
import os
from datetime import datetime, timezone
from pathlib import Path

import openai
from prompt_toolkit import PromptSession
from prompt_toolkit.history import FileHistory
from rich.console import Console
from rich.markdown import Markdown
from rich.status import Status

__version__ = '0.1.0'

console = Console(width=120)
console.print(f'aicli - OpenAI powered AI CLI v{__version__}', style='green bold', highlight=False)

try:
    openai.api_key = os.environ['OPENAI_API_KEY']
except KeyError:
    console.print('You must set the OPENAI_API_KEY environment variable', style='red')
    exit(1)

now_utc = datetime.now(timezone.utc)
setup = f"""
Help the user by responding to their request, the output should be concise and always written in markdown.
The current date and time is {datetime.now()} {now_utc.astimezone().tzinfo.tzname(now_utc)}.
"""

messages = [{'role': 'system', 'content': setup}]

history = Path.home() / '.openai-prompt-history.txt'
session = PromptSession(history=FileHistory(str(history)))

while True:
    try:
        text = session.prompt('➤ ')
    except (KeyboardInterrupt, EOFError):
        break

    if not text:
        continue

    status = Status('[dim]Working on it…[/dim]', console=console)
    status.start()
    messages.append({'role': 'user', 'content': text})

    try:
        response = openai.ChatCompletion.create(model='gpt-4', messages=messages)
    except KeyboardInterrupt:
        # stop the spinner before exiting so the terminal is left clean
        status.stop()
        break

    content = response['choices'][0]['message']['content']
    messages.append({'role': 'assistant', 'content': content})
    status.stop()

    md = Markdown(content)
    console.print(md)
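For reference, the loop above keeps the whole exchange in `messages`, so every request re-sends the full conversation and the model sees earlier turns. A minimal sketch of how the list grows, with hypothetical contents and no API call:

```python
# Start with the system prompt, as the script does:
messages = [{'role': 'system', 'content': 'Help the user ...'}]

# Each user prompt is appended before the API call:
messages.append({'role': 'user', 'content': 'What is the capital of France?'})

# And the assistant reply is appended after, so the next request carries full context:
messages.append({'role': 'assistant', 'content': 'Paris'})

roles = [m['role'] for m in messages]
print(roles)  # → ['system', 'user', 'assistant']
```

One consequence of this design is that very long sessions will eventually exceed the model's context window, since nothing is ever trimmed from the list.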
@duarteocarmo

Requirements:

  • openai
  • rich
  • prompt_toolkit

@duarteocarmo

Might add streaming to this!

@samuelcolvin
Author

Sounds good, I'd be keen for streaming.

Unfortunately, the aicli PyPI package name is taken.

@duarteocarmo

Possible names: I like termllm, and it seems to be free.

    ChatAI-CLI
    TalkBotUtility
    AIConverse
    ChatStreamAI
    CLI-AIChatter
    NeuroTalkTool
    AIChatBridge
    DialogDriveCLI
    IntelliChatCMD
    AIDialogTerminal
    ChatBoxCLI
    AITalkMate
    TerminalThinker
    AIChatMaster
    ChatterCommand
    NeuralNetNatter
    CommandChatAI
    TerminalTalesAI
    AIChatFlowCLI
    SpeakBotUtility
    NLPChatStation
    CommandConverse
    AITermTalk
    ChatterMateCLI
    AIChatLiner
    TerminalTutorAI
    DialogDirectCLI
    CommandCortexChat
    SpeakToAI-CLI
    ChatAIDriveCLI
    AIWordWise
    CmdChatterBox
    TerminalTeteATete
    CLINeuroNatter
    BotBabbleCLI
    Talk2AIShell
    CLI-ChatCortex
    AITalkTerminal
    ThinkChatCLI
    CommandCompanionAI
    QuickQueryAI-CLI
    DialogDynamoCLI
    ConvoCommander
    BotTalkTerminal
    AINatterNote
    CommandChatterAI
    CLI-BrainBanter
    ChatCraftCLI
    AIConvoCraft
    TerminalTalkerAI

@juftin

juftin commented Sep 14, 2023

Nice! @samuelcolvin here's what it would look like to implement streaming with rich.live.Live: https://gist.github.com/juftin/ad4b5fb8bb6b8754928acbf8c8ed3fcf

import openai
from rich.console import Console
from rich.live import Live
from rich.markdown import Markdown

console = Console(width=120)
# `messages` here is the conversation list built up as in the script above
response = openai.ChatCompletion.create(
    model="gpt-4", messages=messages, stream=True
)
complete_message = ""
markdown = Markdown(complete_message)
with Live(markdown, refresh_per_second=15, console=console) as live:
    for chunk in response:
        if chunk["choices"][0]["finish_reason"] is not None:
            break
        chunk_text = chunk["choices"][0]["delta"].get("content", "")
        complete_message += chunk_text
        live.update(Markdown(complete_message))
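The accumulation logic in that snippet can be exercised without any API call by feeding it fake chunks in the same shape the pre-1.0 openai library returns; a sketch with made-up data:

```python
# Fake streaming chunks mimicking the pre-1.0 openai response shape (hypothetical data):
chunks = [
    {"choices": [{"delta": {"content": "Hello"}, "finish_reason": None}]},
    {"choices": [{"delta": {"content": ", world"}, "finish_reason": None}]},
    {"choices": [{"delta": {}, "finish_reason": "stop"}]},  # final chunk carries no content
]

complete_message = ""
for chunk in chunks:
    if chunk["choices"][0]["finish_reason"] is not None:
        break
    # deltas may omit "content" entirely, hence the .get() default
    complete_message += chunk["choices"][0]["delta"].get("content", "")

print(complete_message)  # → Hello, world
```

Each `delta` holds only the new tokens, so the client is responsible for concatenating them into the full message, exactly as the `Live`-driven loop above does on every refresh.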
(Screen recording demo attached: Screen.Recording.2023-09-13.at.7.27.03.PM.mov)

And if you're interested in going the full TUI route, make sure to check out https://github.com/darrenburns/elia - built with Textual.

@samuelcolvin
Author

samuelcolvin commented Sep 14, 2023

I've moved this to a proper PyPI package, available via pip install samuelcolvin-aicli, and I've added streaming as suggested by @juftin.

See https://github.com/samuelcolvin/aicli.
