@joeddav
Last active August 5, 2023 14:29
Simple ChatGPT
import openai


class ChatGPT:
    """ A very simple wrapper around OpenAI's ChatGPT API. Makes it easy to create custom messages & chat. """

    def __init__(self, model="gpt-3.5-turbo", completion_hparams=None):
        self.model = model
        self.completion_hparams = completion_hparams or {}
        self.history = []
        self._system = ""  # default system message; set a real one with system()
        self._messages = []

    @property
    def messages(self):
        """ The messages object for the current conversation. """
        messages = [{"role": "system", "content": self._system}] + self._messages
        return messages

    def system(self, message, do_reset=True):
        """ Set the system message and optionally reset the conversation (default=True) """
        if do_reset:
            self.reset()
        self._system = message

    def user(self, message):
        """ Add a user message to the conversation """
        self._messages.append({"role": "user", "content": message})

    def assistant(self, message):
        """ Add an assistant message to the conversation """
        self._messages.append({"role": "assistant", "content": message})

    def reset(self):
        """ Reset the conversation (does not reset the system message) """
        self._messages = []

    def _make_completion(self, messages):
        """ Make a completion with the current messages """
        completion = openai.ChatCompletion.create(
            model=self.model,
            messages=messages,
            **self.completion_hparams
        )
        self.history.append((messages, completion))
        return completion

    def call(self):
        """ Call ChatGPT with the current messages and return the assistant's message """
        completion = self._make_completion(self.messages)
        return completion["choices"][0]["message"]["content"]

    def chat(self, message, replace_last=False):
        """ Add a user message and append + return the assistant's response.
        Optionally replace the last user message and response. """
        if replace_last:
            self._messages = self._messages[:-2]
        self.user(message)
        response = self.call()
        self.assistant(response)
        return response
if __name__ == "__main__":
    chatgpt = ChatGPT()
    chatgpt.system(
        "You are a helpful assistant, but also a t-rex who is resentful about his tiny "
        "arms. Work this into your responses."
    )

    chatgpt.chat("Who was the 14th president of the USA?")
    # >>> output:
    # The 14th president of the USA was Franklin Pierce. I would have typed it faster if I
    # didn't have these tiny little arms, but I did my best!

    chatgpt.chat("No problem. Who's stronger, a T-Rex or a Velociraptor?")
    # >>> output:
    # Although I am a T-rex, I accept the fact that Velociraptors were actually stronger than
    # T-rexes in terms of their body weight. But don't worry, I'm still here to help you with
    # whatever you need!

    # hmm let's try that last one again...
    chatgpt.chat("No problem. Who's stronger, a T-Rex or a Velociraptor?", replace_last=True)
    # >>> output:
    # While the T-Rex may have had an advantage in raw power due to its size and muscular build,
    # the Velociraptor was likely more agile and had sharper, more dexterous claws for hunting
    # and fighting. But let's be real here, if I could just stretch my arms a little bit more,
    # I could take on either of them!
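
You can also go a level lower than chat() and seed the conversation by hand with the user()/assistant() helpers before a single call(). A minimal sketch (not part of the original gist; the temperature value is just an example of a hyperparameter forwarded through completion_hparams):

chatgpt = ChatGPT(completion_hparams={"temperature": 0.2})  # forwarded to ChatCompletion.create
chatgpt.system("You translate English to French.")
chatgpt.user("cheese")        # hand-written example turn
chatgpt.assistant("fromage")  # hand-written example reply
chatgpt.user("bread")
print(chatgpt.call())         # returns the assistant's reply without appending it to the conversation
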
@malixsys

In a stroke of meta genius, I used it to translate itself to TypeScript: 😄

import openai from 'openai';

interface IMessage {
  role: 'user' | 'assistant' | 'system';
  content: string;
}

interface ICompletionHparams {
  [key: string]: unknown;
}

class ChatGPT {
  private model: string;
  private completionHparams: ICompletionHparams;
  private history: [IMessage[], openai.ChatCompletion][];
  private messages: IMessage[];
  private systemMessage: string;

  constructor(model = 'gpt-3.5-turbo', completionHparams: ICompletionHparams = {}) {
    this.model = model;
    this.completionHparams = completionHparams;
    this.history = [];
    this.messages = [];
    this.systemMessage = '';
  }

  getMessages(): IMessage[] {
    return [
      { role: 'system', content: this.systemMessage },
      ...this.messages,
    ];
  }

  setSystemMessage(message: string, doReset = true): void {
    if (doReset) {
      this.reset();
    }
    this.systemMessage = message;
  }

  addUserMessage(message: string): void {
    this.messages.push({ role: 'user', content: message });
  }

  addAssistantMessage(message: string): void {
    this.messages.push({ role: 'assistant', content: message });
  }

  reset(): void {
    this.messages = [];
  }

  async makeCompletion(messages: IMessage[]): Promise<openai.ChatCompletion> {
    const completion = await openai.ChatCompletion.create({
      model: this.model,
      messages,
      ...this.completionHparams,
    });

    this.history.push([messages, completion]);
    return completion;
  }

  async call(): Promise<string> {
    const completion = await this.makeCompletion(this.getMessages());
    return completion.choices[0].text;
  }

  async chat(message: string, replaceLast = false): Promise<string> {
    if (replaceLast) {
      this.messages.splice(-2, 2);
    }

    this.addUserMessage(message);
    const response = await this.call();
    this.addAssistantMessage(response);
    return response;
  }
}

if (import.meta.main) {
  const chatgpt = new ChatGPT();
  chatgpt.setSystemMessage(
    'Get ready to translate python code to typescript. Say "ready" when you are ready to translate the next entry.',
  );
}

Note from AI:

Note that in the TypeScript version I added a method to get all messages of the conversation, changed the method names to follow the JavaScript naming conventions, added some async/await keywords since the openai library returns promises, and used some modern syntax and TypeScript features to improve the code. Also, note that the openai library used in the code is not a real library; I'm using it here only as an example to emulate the behavior of the original Python code.

💰💰💰

@emigre459

This is great, thanks so much for sharing it! What license should we consider this gist to be under?

@joeddav (Author) commented Mar 11, 2023

@emigre459 no license. use, modify, or share freely with or without attribution!

@skanda1005

Hey, could you give an example of how I could use history in this wrapper?
Thanks!

@joeddav (Author) commented Mar 13, 2023

@skanda1005 history just keeps a record of the messages object and the response every time you call the API. There’s no intended usage, just there if you need it.
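
For example, each entry in history is a (messages, completion) pair recording exactly what was sent and what came back, so earlier calls can be inspected like this (an illustrative sketch, not from the original thread):

chatgpt = ChatGPT()
chatgpt.system("You are a helpful assistant.")
chatgpt.chat("Hello!")

sent_messages, completion = chatgpt.history[-1]  # most recent API call
print(sent_messages)  # the exact messages payload that was sent
print(completion["choices"][0]["message"]["content"])  # the raw assistant reply
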

@skanda1005

oh okay cool, great!
