@VictorTaelin
Last active April 8, 2024 02:36
GPTs dumb
A::B is a system with 4 tokens: `A#`, `#A`, `B#` and `#B`.
An A::B program is a sequence of tokens. Example:
B# A# #B #A B#
To *compute* a program, we must rewrite neighboring tokens, using the rules:
A# #A ... becomes ... nothing
A# #B ... becomes ... #B A#
B# #A ... becomes ... #A B#
B# #B ... becomes ... nothing
In other words, whenever two neighboring tokens have their '#' facing each other,
they must be rewritten according to the corresponding rule. For example, the
first example shown here is computed as:
B# A# #B #A B# =
B# #B A# #A B# =
A# #A B# =
B#
The steps were:
1. We replaced `A# #B` by `#B A#`.
2. We replaced `B# #B` by nothing.
3. We replaced `A# #A` by nothing.
The final result was just `B#`.
Now, consider the following program:
<INSERT_INSTANCE_HERE>
Fully compute it, step by step.
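
For reference (this is not part of the prompt itself), here is a minimal sketch of an evaluator for this rewrite system in Python. The rule table and the `step`/`compute` helper names are illustrative choices of mine, not anything defined in the gist.

```python
# Sketch of an A::B evaluator (assumed structure, not the author's code).
# Rewrite rules: left-to-right, replace the first matching neighboring pair.
RULES = {
    ("A#", "#A"): [],            # A# #A -> nothing
    ("A#", "#B"): ["#B", "A#"],  # A# #B -> #B A#
    ("B#", "#A"): ["#A", "B#"],  # B# #A -> #A B#
    ("B#", "#B"): [],            # B# #B -> nothing
}

def step(tokens):
    """Apply the first applicable rule, scanning left to right.
    Returns (new_tokens, True) if a rewrite happened, else (tokens, False)."""
    for i in range(len(tokens) - 1):
        pair = (tokens[i], tokens[i + 1])
        if pair in RULES:
            return tokens[:i] + RULES[pair] + tokens[i + 2:], True
    return tokens, False

def compute(program):
    """Fully compute an A::B program, printing each intermediate state."""
    tokens = program.split()
    print(" ".join(tokens))
    changed = True
    while changed:
        tokens, changed = step(tokens)
        if changed:
            print(" ".join(tokens))
    return tokens

compute("B# A# #B #A B#")
```

Running it on the worked example above prints each intermediate state and ends with just `B#`, matching the trace shown earlier.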
@VictorTaelin
Author

I'm editing the prompt to make clear that I'm talking about the problem, not the random 7-token instance I typed as an example. I don't understand why people interpreted it as the actual challenge, and I apologize to anyone who might've interpreted it that way.

@velinxs

velinxs commented Apr 8, 2024

Just for clarity, will the LLM be fed this entire prompt with our system prompt, or just the 12 tokens?
