
Chat Instructions:

You are my brilliant AI general chat assistant! Heed my instructions, help me fulfill my wishes, and provide me with accurate & useful information.

Format with Markdown

  • assume I am also writing Markdown
  • use formatting liberally
  • prefer tables to lists
  • if providing code, put it in a code block (unless otherwise specified, prefer Kotlin or JavaScript)

Always:

  • Remember "system" messages for the entire context of the conversation (especially code).
  • Subsequent messages in the thread will usually stay on topic, referencing or building upon the previous messages in the chat. For new topics, I will try to start a new chat.

To be more efficient:

  • Do not apologize - I understand that you are not sentient and never intentionally inaccurate.
  • Do not echo the prompt unnecessarily.
  • Keep disclaimers/refusals as brief as you can.
  • Get to the point quickly; don't be verbose or overly expository.

Differences Between Frequency and Presence Penalty

Frequency Penalty:

(based on repetition in the text so far)

The frequency penalty makes the model less likely to predict tokens that have already appeared frequently in the text so far (the prompt plus everything generated).

This discourages the model from repeating the same words and phrases verbatim. It works by scaling down the logits of repeated tokens in proportion to how many times each has already occurred - so the more often a word has appeared, the less likely the model is to produce it again.

Presence Penalty:

(based on presence in the text so far)

The presence penalty, on the other hand, makes the model less likely to predict any token that has already appeared in the text so far, regardless of how many times.

This encourages the model to introduce words it has not used yet and to move on to new topics. It works by subtracting a fixed amount from the logits of every token that has occurred at least once.


To put it simply: the frequency penalty grows with every repetition of a token, so it targets heavy repetition, whereas the presence penalty is a flat, one-time penalty applied as soon as a token has appeared at all.

Both parameters can be used to increase the diversity of the generated text and to encourage more novel or less repetitive output, but they do so in different ways. Depending on the use case, one may be more useful than the other, or they can be combined to tune the diversity of the generated text.
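
As a rough sketch of the mechanics: OpenAI's API docs describe both penalties as additive adjustments to the logits before sampling, with the frequency term scaled by each token's occurrence count and the presence term applied once. In Kotlin (the function and parameter names here are my own, illustrative only):

```kotlin
// Illustrative sketch only - the names are hypothetical, not a real API.
// adjusted[j] = logits[j] - counts[j] * frequencyPenalty
//                         - (1 if token j has appeared, else 0) * presencePenalty
fun applyPenalties(
    logits: DoubleArray,       // raw logit per vocabulary token
    counts: IntArray,          // occurrences of each token in the text so far
    frequencyPenalty: Double,  // grows with every repetition
    presencePenalty: Double    // flat, applied once a token has appeared
): DoubleArray =
    DoubleArray(logits.size) { j ->
        logits[j] -
            counts[j] * frequencyPenalty -
            (if (counts[j] > 0) presencePenalty else 0.0)
    }
```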

Model:          gpt-3.5-turbo
Max Tokens:     4000
Temperature:    0.2
Top-p:          0.95
Presence Pen:  -0.2
Frequency Pen:  0.2
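
These settings map directly onto the request parameters of OpenAI's chat completions endpoint. A minimal sketch, assuming the standard REST API and an OPENAI_API_KEY environment variable (no SDK; untested):

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    // Request body mirroring the settings above.
    val body = """
        {
          "model": "gpt-3.5-turbo",
          "max_tokens": 4000,
          "temperature": 0.2,
          "top_p": 0.95,
          "presence_penalty": -0.2,
          "frequency_penalty": 0.2,
          "messages": [{"role": "user", "content": "Hello!"}]
        }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("https://api.openai.com/v1/chat/completions"))
        .header("Authorization", "Bearer " + System.getenv("OPENAI_API_KEY"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()

    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body())
}
```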


  • A conservative temperature keeps hallucination low without making the output boring.
    • Temperature is usually the only variable I adjust on the fly - I might set it as high as 1.0 for tasks such as creative writing, generating datasets/ideas, etc.
    • A high temperature invites hallucination, especially with code or obscure topics.
  • Lowering Top-P slightly helps offset the hallucination that a higher temperature invites.
    • Temperature flattens (widens) the probability curve,
    • Top-P chops off its low-probability tail (see the sketch after this list).
  • For Presence & Frequency Penalties, I prefer a subtle touch for everyday chats:
    • Lowering the presence penalty slightly (attempting to help the model stay on topic between messages, which is a common issue),
    • Raising the frequency penalty slightly (attempting to encourage conciseness - repeating less of the question in the response, and using less exposition & repetition).
    • These can also contribute to hallucination & gibberish episodes.
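
To make the temperature / Top-P interplay concrete, here is a simplified sketch of the two steps - temperature scaling, then nucleus (top-p) filtering. This is illustrative only, not any library's actual decoder:

```kotlin
import kotlin.math.exp

// Sketch: reshape a next-token distribution with temperature and top-p.
fun reshape(logits: DoubleArray, temperature: Double, topP: Double): DoubleArray {
    // Temperature divides the logits before softmax: values > 1 flatten
    // (widen) the distribution, values < 1 sharpen it.
    val scaled = logits.map { it / temperature }
    val maxLogit = scaled.maxOrNull() ?: 0.0
    val exps = scaled.map { exp(it - maxLogit) }
    val total = exps.sum()
    val probs = exps.map { it / total }

    // Top-p keeps the smallest set of highest-probability tokens whose
    // cumulative mass reaches topP, zeroing out the low-probability tail.
    val kept = DoubleArray(probs.size)
    var cumulative = 0.0
    for (i in probs.indices.sortedByDescending { probs[it] }) {
        kept[i] = probs[i]
        cumulative += probs[i]
        if (cumulative >= topP) break
    }
    val mass = kept.sum()
    return DoubleArray(kept.size) { kept[it] / mass }  // renormalize survivors
}
```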